Article

Determination of Benefits of the Application of CMMS Database Improvement Proposals

Faculty of Maritime Studies, University of Split, 21000 Split, Croatia
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(4), 2731; https://doi.org/10.3390/app13042731
Submission received: 31 January 2023 / Revised: 18 February 2023 / Accepted: 19 February 2023 / Published: 20 February 2023

Featured Application

This article concludes the authors' study of CMMS databases and of the measures they developed to improve data quality in these databases. It includes a calculation of the benefits that the proposed measures, which have already been published in several articles, can bring to data quality in CMMS databases.

Abstract

Computerized maintenance management systems (CMMSs) are software packages that support or organize the maintenance tasks of assets or equipment. They are found in the background of any ship maintenance operation and are an important part of maintenance planning, spare parts supply, record keeping, etc. In the marine market, numerous CMMS vendors compete fiercely to develop better and more modern software that will capture the market, a competition accompanied by published analyses and scientific papers. At the same time, the quality of the data entered into CMMS databases is questionable, a fact that has been ignored both in practice and in scientific circles; until recently, there were no published analyses and there was no way to measure the quality of the entered data. This article presents two proposals for improving the quality of CMMS databases and calculates their potential benefits. Implementing the first proposal, the evaluation methodology for the ship’s Planned Maintenance System database, will bring significant financial or safety benefits to between 10% and 15% of databases. This measure will also have an impact on more than 40% of the remaining databases, which can likewise be improved. The second proposal will have a smaller impact of only 4%. Together, these proposals can improve more than 60% of the databases and will result in a significant increase in safety or in financial savings.

1. Introduction

Ship maintenance is one of the most researched topics in the industry, and numerous articles have been published on its various aspects [1,2,3]. An important part of organizing successful maintenance is performed with the help of a CMMS (Computerized Maintenance Management System). The concept started long ago as a simple Planned Maintenance System (PMS) and gradually evolved into computerized systems with many modules and multiple functions. Today, there are many different CMMS computer programs in the maritime industry; their total number is estimated at more than 70. These systems differ in design, quality, and functionality.
PMS and CMMS as tools to reduce downtime and maintenance costs have been widely researched [4,5,6]. Research has shown that the adoption of PMS brought tremendous financial and safety benefits, and the adoption of CMMS continued this process [7]. At the same time, it is very difficult to find data that measure the benefits resulting from the introduction of either system. The few values published in scientific articles vary considerably, reporting maintenance improvements of 30 to 50% (a variation of more than 50%) depending on the case studied [8,9].
PMS in paper form was a significant step in improving maintenance and enhancing ship safety. The introduction of CMMSs in shipping brought improvements in terms of ease and speed of communication with the office, easier monitoring of maintenance and procurement, and simpler data exchange. Since the communication is mostly done via a satellite link, the size of the exchanged data packets must be very small, usually less than 200 kb [10]. The packet size rarely exceeds this value even when major changes are made to the database. This small packet size has enabled CMMS applications that run in the cloud, which are becoming increasingly popular in this market [10].
CMMSs or, as they are also known, computerized Planned Maintenance Systems (PMSs) are in daily use on a wide variety of ships. Although they are widely used, there is neither adequate scientific follow-up of these systems nor systematic analysis of the systems and their data. Planned maintenance in shipping was addressed in scientific articles in the late 20th century, mainly in Europe and North America [11,12,13]. The research topics at that time focused mainly on the application of CMMS and on aspects of the systems used. Today, authors still analyze and research similar topics [14,15]. Another frequently researched topic is the performance of different CMMSs and their comparison [16,17].
Alan Mortimer, a former UK Chief Engineer, echoing various opinions on the quality of CMMS, wrote: “Commercial Planned Maintenance (PM) systems are a collection of very variable beasts, some good, some bad, and some indifferent” [18].
Although this opinion is widely held in the maritime industry, it is hard to believe that products (in this case, computer programs) that are poor and do not meet the needs of users could survive competition on the commercial market. Prompted by this statement, the research team assumed that the cause of the problem lies elsewhere and, during the research, came across two claims that together describe the problem much better. The first possible cause is stated by Davies, who notes that the computerization of poor management systems only leads to poor results more quickly [19]. The second possible cause is the well-known GI–GO (Garbage In–Garbage Out) effect, which is well described in the article by Kilkenny and Kerin [20].
This assumption, that poor databases are the root cause of all CMMS problems, formed the basis for the research conducted by the CMMS research team at the Faculty of Maritime Studies in Split. A large number of CMMS databases had to be examined and analyzed to verify this assumption. The quality of databases and their impact have been studied by many authors [21,22,23,24], but only for land-based industries. Research on this topic is very limited or non-existent in the maritime industry.
The first discovery at the beginning of the research was that a large number of ship databases contain very poor data and numerous problems. At this point, the team faced a major challenge. Although it was clear that the databases were in poor condition, this conclusion was based only on subjective opinion and personal experience. There was no tool or method in the industry to evaluate CMMS databases, measure their quality, identify areas for improvement, and determine the steps needed to improve database quality.
To solve the problem, the team’s first task was to develop a universal tool to assess the quality of the CMMS database. The main method used to create the new tool was DQA (Data Quality Assessment), shown in Figure 1, which is based on the idea designed by Pipino et al. [25].
DQA is a methodology developed to provide general principles for the definition of data quality metrics [26]. According to the authors of the cited texts (Batini et al. and Ballou et al.) [27], the main characteristic of the methodology is that it is tailor-made, created specifically for each task; a “one size fits all” solution cannot work in different circumstances [26]. There are many examples of the DQA methodology in practice and of its use for different aspects and different types of research [28,29].
While studying the databases to determine how the database improvement proposal program works, the research team encountered an interesting problem. In examining 17 vessels from two companies, seven similar improvement requests were found on three vessels, each claiming that there were no manufacturer’s maintenance schedules on board and requesting that the company provide them. The number of improvement requests for this type of deficiency is relatively low, mainly because both companies only purchase new vessels. This type of issue often occurs when a company buys a used vessel and the previous crew takes all the operating manuals with them, along with the maintenance logs, data, etc., so the new crew starts from scratch, often without the manufacturer’s operating manuals. These seven deficiencies were identified during the CMMS implementation phase and then reported to the company, which worked to correct them. Five of these deficiencies were successfully corrected, while two were not. The reason for the failure to correct them was not identified, although the company’s SMS was reviewed to determine whether it contained instructions or recommendations for correcting this type of deficiency.
Consequently, in five out of seven cases, the maintenance plan and spare parts were added to the CMMS by copying the data from the manual received, while in two cases, the items were still missing. Reviewing various articles and books, the research team found that no one has yet answered the question of how to create the equipment maintenance plan without using the manufacturer’s manual.
From the above, it can be concluded that a significant improvement in database quality (read: maintenance and safety) can be achieved if these two database problems are solved. These tasks are the focus of the research team, and this paper presents the potential benefits of these two solutions. The design and methods used to create the Evaluation Methodology for the Ship PMS are described in Section 2, while the methods used to solve the second problem are explained in Section 3. The results and discussions are presented in Section 4, followed by the Conclusion, which summarizes the overall benefits of applying these proposals and highlights the importance of the research.

2. The Evaluation Methodology

Another example of the application of the DQA methodology is the evaluation methodology for the ship’s Planned Maintenance System database. The methodology was developed at the beginning of the research to establish firm rules for CMMS database assessment. All DQA assessment strategies [26] were considered when creating the methodology [30]:
  • The acquisition of new data;
  • The standardization (or normalization);
  • The acquisition of links;
  • The integration of data and schemas;
  • The trustworthiness of the source;
  • The localization and correction of errors;
  • The cost optimization.
A tool called the Evaluation Methodology for Ship PMS [30] was developed and field tested to verify its functionality. It consists of a questionnaire with thirty questions divided into six groups: Machinery and Equipment, Jobs inside DB, Special jobs and Rules, DB Jobs general, Spare Parts, and Miscellaneous (Table 1). In front of each question, there is a field (mark) indicating the importance of the question for the quality of the database. The “traffic light” principle (R, Y, G) is used to determine the colors in the field and to describe the importance of the question. A red mark has the highest importance: the deficiencies revealed by these questions have a significant impact on the quality of maintenance, and they should be taken seriously and corrected to improve the database and the quality of maintenance. The questions with a yellow mark are of medium importance; they have a lower impact on database quality, and the deficiencies they reveal mainly affect user workload, while the impact on maintenance quality and reliability is negligible. These deficiencies should still be corrected because they create unnecessary work for the staff [31], which may cause aversion to the system. Each question is answered with a mark from one to five.
The marks should have the following meaning:
  • Mark 1—Completely negative evaluation result;
  • Mark 2—Predominantly negative evaluation;
  • Mark 3—Predominantly positive evaluation with a significant number of irregularities;
  • Mark 4—Predominantly positive evaluation with a small number of irregularities;
  • Mark 5—Completely positive evaluation.
Questions rated five or four are considered satisfactory and require no changes to the database. Questions rated four leave room for improvement, but DB changes are not recommended because they would not bring a significant quality improvement. Questions rated three, two, or one are considered unsatisfactory, and the data should be improved. The schedule for data changes in the database should follow the color schedule (R, Y, G).
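To illustrate how the marks and the traffic-light colors combine into a repair schedule, the following Python sketch encodes a small excerpt of the questionnaire from Table 1 and lists the deficiencies (marks of three or lower) in color order. The example marks are hypothetical and the question texts are abbreviated; the sketch only mirrors the scoring rules described above and is not part of the published methodology.

```python
# A minimal sketch of the scoring rules described above.
# Question texts are abbreviated from Table 1; the example marks are hypothetical.

PRIORITY = {"R": 0, "Y": 1, "G": 2}  # repair order follows the traffic-light colors

# (question number, color, short description) -- excerpt of the questionnaire only
QUESTIONS = [
    (1, "R", "All machinery and equipment recorded in the database"),
    (4, "Y", "No machine has more subcomponents than necessary"),
    (7, "G", "Consistent style, abbreviations and identifiers for all entries"),
    (23, "Y", "All required spare parts included in the database"),
    (26, "R", "Company critical spare parts list inserted in the DB"),
]

def deficiencies(marks: dict[int, int]) -> list[tuple[int, str, str]]:
    """Return questions marked 3 or lower, ordered by color priority (R, Y, G)
    and, within the same color, by ascending mark (worst first)."""
    found = [
        (number, color, text)
        for number, color, text in QUESTIONS
        if marks.get(number, 5) <= 3
    ]
    return sorted(found, key=lambda item: (PRIORITY[item[1]], marks[item[0]]))

# Hypothetical marks for one database (questions not listed default to 5).
example_marks = {1: 2, 4: 3, 7: 5, 23: 3, 26: 1}
for number, color, text in deficiencies(example_marks):
    print(f"Q{number} [{color}] mark {example_marks[number]}: {text}")
```

Under these hypothetical marks, the red questions are scheduled first, followed by the yellow ones, which corresponds to the color schedule described above.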
After the development of the methodology, serious efforts were made to test it in practice and to study various aspects of its application. The methodology was used (from 2017 to 2019) to analyze the state and quality of forty-four CMMS databases in five different shipping companies operating different types of vessels (one company operates passenger vessels, two companies operate a mix of bulk carriers and tankers, one company operates bulk carriers, and one company operates VLCCs). Testing of the methodology in different companies, with different working practices and methods, and on different types of vessels has shown that it can be used as a universal tool for evaluating CMMS databases and paper-based PMSs.
After testing, the following claims about the methodology were made and verified:
  • The methodology is a useful tool for evaluating CMMS data and databases [30];
  • The methodology is easy to use [30];
  • The results obtained are reliable [30];
  • The application of the methodology reduces the subjectivity of the evaluator [32];
  • The application of the methodology facilitates the evaluation of databases [32];
  • The application of the methodology makes the evaluation much more detailed [32];
  • The application of the methodology facilitates the identification of deficiencies [32].

Evaluation Results

The results of the evaluation of forty-four CMMS databases were published in the article [33]. The testing of the functionality of the methodology is described in the same article and an analysis of the related results is presented. Further analysis of the obtained results was not performed, nor was an analysis of the identified deficiencies. Therefore, the necessary conclusions for maintenance planning that could affect the quality of maintenance were not derived from the evaluation. The deficiencies identified during this evaluation are listed in Table 2 and Table 3.
Table 2 and Table 3 present a cumulative analysis of the identified deficiencies. Each row represents a database, while the columns give the total number of deficiencies identified, sorted by the scores obtained and indicated by the color of the group.
In accordance with the recommendations for the application of the methodology described above, all deficiencies rated four are considered minor and no action is required to correct them. Notwithstanding the fact that no action is required, the CMMS can still be improved in these areas. Table 2 and Table 3 show that there is not a single database in which no deficiencies were identified, i.e., every database can be improved in some area. At the same time, the lowest number of deficiencies found was four, and this occurred in only one database.
Since the scoring methodology recommends ignoring all items rated four (i.e., there is no need for improvement actions in these areas), new tables have been created (Table 4 and Table 5) that include only deficiencies rated three or worse, and all green and yellow boxes have been removed. These tables still contain a very large number of databases and a large number of deficiencies.
The analysis of Table 2, Table 3, Table 4 and Table 5 shows that the analyzed databases have a very large number of deficiencies; in total, there were 220 major deficiencies in the analyzed databases, of which 47 were rated one, 30 were rated two, and 143 were rated three.
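These totals are obtained by summing the per-database counts from Table 4 and Table 5. The following Python sketch shows the aggregation on a few illustrative rows only; running the same code over all rows of the two tables reproduces the figures quoted above (47, 30, and 143, i.e., 220 in total).

```python
# A minimal sketch of how the deficiency totals are aggregated from Tables 4 and 5:
# each row is a database with its count of deficiencies rated 1, 2 and 3.
# Only a few illustrative rows are shown here.

from collections import Counter

# (database, deficiencies rated 1, rated 2, rated 3) -- excerpt only
ROWS = [
    ("A2", 3, 2, 2),
    ("A3", 1, 1, 0),
    ("C4", 4, 0, 1),
    ("D1", 3, 2, 11),
    ("E1", 0, 0, 1),
]

totals = Counter()
for database, rated_1, rated_2, rated_3 in ROWS:
    totals["rated 1"] += rated_1
    totals["rated 2"] += rated_2
    totals["rated 3"] += rated_3

totals["major deficiencies"] = sum(totals.values())
print(dict(totals))
```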
Further reflection on the results presented in Table 2, Table 3, Table 4 and Table 5 leads to the following findings:
  • When evaluating the databases based on methodology, deficiencies were found in all of the databases examined;
  • The identified deficiencies varied, some were minor and insignificant, others were very serious;
  • Only one of the investigated companies had no red deficiencies and only one deficiency in the yellow group, which showed that the system in this company was seriously monitored;
  • There was a large number of databases that require immediate repair actions (more than 77% of the examined databases);
  • On average, there were more than six serious deficiencies per database (6.2, to be exact).
Further review of the assessment results showed that Company D was not paying enough attention to the CMMS, i.e., it had not recognized the benefits that the system can provide. These poor assessment results show that the CMMS in Company D was neglected both in the offices and on the ships. Since Company D is the largest of the companies studied, with a large number of vessels, its results could affect the objectivity of the entire research. In order to obtain the most objective picture, the results of the evaluation of Company D’s vessels were excluded from the final consideration.
After excluding Company D’s vessels from the analysis, the following picture emerges:
  • Minor or major deficiencies were found in all the databases examined;
  • There was a large number of databases where immediate repair actions were required (in 60% of the analyzed databases);
  • A total of 63% of all serious deficiencies concerned only four vessels;
  • The average number of serious deficiencies was only two;
  • Only one database had missing components (4%);
  • Only one database was found to lack an adequate maintenance plan (4%).
It can be concluded that more than 60% of all databases could be improved, 16% of them in more than one area. The results of this analysis show that only 1/3 of the CMMS databases were in good condition. These poor results were not unexpected, because the only other information found about the condition of ships’ CMMS databases declared 1/4 of them to be good [34].

3. CMMS Development Problem

A possible solution to the missing books problem was published in two articles [1,35], the first [1] showing the preparation of the methodology and the second [35] showing the creation of the maintenance plan.
Fault Tree Analysis (FTA) [36,37] is a widely used method for evaluating the reliability of systems [38] and can be applied in either static or dynamic form. The method is also used to analyze fault causes, improve early fault detection, and improve fault diagnosis during engine operation by reducing false conclusions and inappropriate corrective actions [39]. In this part of the study, the method is used to analyze the turbocharger system of marine diesel engines, to identify its possible faults, and to determine the areas (components) that should be serviced. The faults identified with the FTA are simulated using the Wartsila-Transas 5000 engine room simulator on the propulsion system of the tanker LCC (Aframax) with the main engine MAN B&W 6S60MC-C [40]. The use of the E/R simulator together with the FTA simplifies the preparation of the fault list and allows its verification from different working aspects.
By combining these two tools, a comprehensive fault list of the turbocharger system of marine diesel engines is created and analyzed in detail. The article [1] once again shows that FTA is extremely useful and practical in analyzing system reliability, energy efficiency, and maintenance costs.
After making a comprehensive list of the faults of the turbocharger system of a marine diesel engine and analyzing what maintenance work needs to be done to avoid these faults, it was necessary to derive the maintenance schedule for the system from the fault list. Each fault from the FTA list is analyzed, and then appropriate preventive maintenance activities are assigned to prevent the occurrence of each fault, resulting in a detailed maintenance plan for the turbocharger system of the marine diesel engine. Several maintenance plans were prepared by the experiment participants (authors of the articles), and each author used his or her own (personal) experience in marine engineering to prepare the maintenance plan.
These plans were compared and a slight variation was found in the maintenance plans for different tasks. These differences are attributed to the different experiences and practices of the authors [41]. To verify the obtained results, the maintenance plans prepared by the authors using the FTA list were compared with the maintenance instructions for the turbocharger system of the marine diesel engine [42,43]. The comparison showed that these schedules differ slightly from the manufacturer’s maintenance recommendations, but the overall verdict is that they are very similar and the end goal is achieved in both cases.
The conclusion of this part and the contribution to the overall objective is to show that FTA combined with engineering experience can be a substitute for missing manufacturer’s maintenance recommendations when creating the CMMS database. Although the newly created maintenance plan is not the same as the manufacturer’s recommended plan, it is very close to it and is a good substitute for it.
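As an illustration of this approach, the Python sketch below pairs a few turbocharger faults of the kind an FTA would list with generic preventive actions and groups them into a simple interval-based plan. The specific faults, actions, and intervals are illustrative placeholders chosen by analogy; they are not taken from the published FTA [1], the derived plans [35], or the manufacturer’s instructions [42,43].

```python
# An illustrative sketch of deriving a preventive maintenance plan from an
# FTA fault list. Faults, actions and intervals below are generic placeholders;
# they do not reproduce the published plans or the manufacturer's manuals.

from collections import defaultdict

# (fault from the FTA list, preventive action, interval in running hours)
FAULT_TO_ACTION = [
    ("Compressor side fouling", "Water-wash compressor side", 250),
    ("Turbine side fouling", "Dry-clean turbine side", 500),
    ("Lubricating oil degradation", "Sample and analyse bearing lube oil", 1000),
    ("Air filter clogging", "Inspect and renew suction air filter", 500),
    ("Bearing wear", "Measure rotor axial and radial clearances", 8000),
]

def maintenance_plan(fault_to_action):
    """Group preventive actions by interval, as a CMMS job list would."""
    plan = defaultdict(list)
    for fault, action, interval_h in fault_to_action:
        plan[interval_h].append(f"{action} (prevents: {fault})")
    return dict(sorted(plan.items()))

for interval_h, jobs in maintenance_plan(FAULT_TO_ACTION).items():
    print(f"Every {interval_h} h:")
    for job in jobs:
        print(f"  - {job}")
```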

4. Benefits of These Proposals

The first step in improving the entire CMMS system is a detailed review of the database and the data it contains using the Evaluation Methodology for Ship PMS (Figure 2).
This will uncover all the data needed for the improvement effort. This requires expertise, i.e., a good knowledge of the computer programs used and a good knowledge of seamanship, more specifically, marine engineering.
The evaluation methodology for a ship’s PMS [30] should be applied during the development of the CMMS database and during the use of the system, to avoid deficiencies in the database and to allow proper use with all its benefits. These benefits are relatively easy to calculate using the basic equation:
B = Nsd/(Nq × Nv)
where:
  • Nsd—The number of discovered deficiencies;
  • Nq—Total number of questions;
  • Nv—The number of analyzed vessels.
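Read this way, B is the share of discovered deficiencies relative to all question–vessel checks. A minimal numerical sketch in Python, using hypothetical input values rather than the study data, is shown below.

```python
# A minimal sketch of the benefit equation B = Nsd / (Nq x Nv).
# The input values below are hypothetical and serve only as an illustration.

def benefit(discovered_deficiencies: int, questions: int, vessels: int) -> float:
    """Share of discovered deficiencies relative to all question-vessel checks."""
    return discovered_deficiencies / (questions * vessels)

# Hypothetical example: 30 questions per database, 25 analyzed vessels,
# and 45 deficiencies discovered in total.
print(f"B = {benefit(45, 30, 25):.1%}")  # -> B = 6.0%
```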
The application of the methodology will result in the following:
  • There will be 4% fewer databases with missing equipment, which will increase maintenance reliability and reduce corrective maintenance;
  • There will be 7% fewer databases with missing work orders, which will increase maintenance reliability and decrease corrective maintenance actions;
  • There will be 13% fewer databases with missing spare parts, which will increase inventory accuracy, resulting in financial savings and increased vessel safety;
  • More than 60% of the databases will have improved data and fewer discrepancies, giving the crew better insight into the system;
  • Overall costs will be significantly reduced as fewer repairs will need to be made and/or fewer emergency spare parts will need to be ordered.
In order to calculate the benefit of the second part of the research (the missing-books problem), it is necessary to determine how many books are still missing when the database is created. According to two database factories (companies that specialize in creating databases), this number varies: it depends on whether the ship is new or used, whether the data are in electronic or paper form, where the ship was built, etc.
By studying all available databases according to [33] and comparing the number of such cases with the number of equipment items on board, the estimated benefit of this part of the research is the potential improvement of 4% of the databases (4% of the equipment will gain a maintenance plan that allows better maintenance of these systems).
The given value was calculated for newbuildings (all analyzed ships were treated as newbuildings), and the value of solving the missing-books problem for second-hand ships remains open as a task for future analysis.
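The 4% estimate is, in essence, the ratio of equipment items without a manufacturer’s manual to all equipment items recorded in the analyzed databases. A brief sketch of that ratio, using hypothetical counts rather than the study data, is shown below.

```python
# A hedged sketch of the missing-books benefit estimate: the share of equipment
# items lacking a manufacturer's manual (and therefore a maintenance plan)
# among all equipment items in the analyzed databases.
# The counts below are hypothetical placeholders, not the study data.

def missing_books_share(items_without_manual: int, total_items: int) -> float:
    """Fraction of equipment that would gain a maintenance plan."""
    return items_without_manual / total_items

# Hypothetical example: 28 equipment items without manuals out of 700 in total.
print(f"{missing_books_share(28, 700):.0%}")  # -> 4%
```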

5. Conclusions

This paper presents two solutions to improve data quality in CMMS databases. The first and far more significant improvement proposal is the evaluation methodology for the ship’s PMS, which allows a clear evaluation of the data quality and the identification of areas in the database that can be improved in order to improve the overall maintenance process. The significance of this proposal is that, for the first time, a tool has been created to clearly assess whether the CMMS data are valid, with assessment results that remain the same or similar even if different people perform the assessment. By incorporating the vessel PMS evaluation methodology into the design and daily use of the CMMS database, the potential benefits described in Section 4 can result in thousands of dollars in maintenance savings where maintenance is not properly adjusted. At the same time, the impact on the safety of the vessel, crew, cargo, and environment can be measured in extremely large amounts (millions or more) if maintenance of certain equipment is properly adjusted and/or performed. A side effect of applying the methodology and improving the quality of the data in the database is to demonstrate to the crew that the CMMS is an important system on board and that it receives the attention it deserves, which further motivates the crew to work with the system on a daily basis. The calculated value of this proposed improvement is reflected in the expected improvement of up to 60% of all CMMS databases, including up to 16% in more than one area.
The second proposed improvement is seemingly insignificant, but it is very useful in the case of second-hand vessels, especially those built in failed shipyards or equipped with equipment manufactured by failed companies. The actual financial impact of this proposal is very difficult to calculate after the fact, since each of the possible events can be expressed differently. The benefits of solving the missing-books problem calculated in this paper are small but not insignificant. According to the calculations in this paper, 4% of the equipment will benefit from this proposal, i.e., there will be 4% fewer potential failures and a 4% lower probability of severe damage.
The calculation of the benefits from the application of these proposals has been made very conservatively, assuming lower values for improvements and for vessels that are purchased as newbuildings. Regardless of how the benefits of these two proposals are calculated, it is clear that both proposals will reduce deficiencies in more than 60% of the databases, improve vessel maintenance, and increase vessel safety.
The main problem with the proposed methods is their current status. Despite their great potential for improvement and the fact that they are publicly available, they are not widely used in practice. The only demonstrated use in practice is by the companies that the team contacted personally and the companies that acted as test companies. The next steps for the team are to analyze why the use of the measures has not expanded and what should be done to expand it.

Author Contributions

Conceptualization, L.S. and N.R.; methodology, L.S., N.R. and T.S.; validation, Đ.D.; formal analysis, L.S. and T.S.; investigation, L.S.; writing—original draft preparation, L.S.; writing—review and editing, L.S., N.R., T.S. and Đ.D.; supervision, N.R., T.S. and Đ.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Knežević, V.; Orović, J.; Stazić, L.; Čulin, J. Fault tree analysis and failure diagnosis of marine diesel engine turbocharger system. J. Mar. Sci. Eng. 2020, 8, 1004.
2. Liu, S.; Chen, H.; Shang, B.; Papanikolaou, A. Supporting Predictive Maintenance of a Ship by Analysis of Onboard Measurements. J. Mar. Sci. Eng. 2022, 10, 215.
3. Frangopoulos, C.A. Developments, Trends, and Challenges in Optimization of Ship Energy Systems. Appl. Sci. 2020, 10, 4639.
4. Munyensanga, P.; Widyanto, S.A.; Aziz, M.N.A.; Rusnaldy, P. Information management to improve the effectiveness of preventive maintenance activities with computerized maintenance management system at the intake system of circulating water pump. Procedia CIRP 2018, 78, 289–294.
5. Parida, A.; Kumar, U. Maintenance performance measurement (MPM): Issues and challenges. J. Qual. Maint. Eng. 2006, 12, 239–251.
6. Poór, P.; Šimon, M.; Karková, M. CMMS as an effective solution for company maintenance costs reduction. In Production Management and Engineering Sciences; Taylor & Francis Group: London, UK, 2015; pp. 241–246. ISBN 978-1-138-02856-2.
7. Wienker, M.; Henderson, K.; Volkerts, J. The Computerized Maintenance Management System an Essential Tool for World Class Maintenance. Procedia Eng. 2016, 138, 413–420.
8. Eti, M.C.; Ogaji, S.O.T.; Probert, S.D. Reducing the cost of preventive maintenance (PM) through adopting a proactive reliability-focused culture. Appl. Energy 2006, 83, 1235–1248.
9. Hamilton, J. Early-Stage Transition to Predictive Maintenance: Using CMMS, IR Scans, and Vibration Analysis to Improve Uptime and Lower Maintenance Costs. Bachelor’s Thesis, Portland State University, Portland, OR, USA, 2015.
10. SpecTec. CMMS Presentation. 2020. Available online: https://www.spectec.net (accessed on 11 February 2023).
11. Alleyne, P.; Rhoden, D.; Williams, D. Expert scheduling and planned maintenance systems. Trans. Inst. Mar. Eng. 1991, 103, 365.
12. Cieri, A.M.; Elfont, M.M. Engineered approach to effective maintenance management. Nav. Eng. J. 1991, 103, 253–261.
13. Wireman, T. Computerized Maintenance Management Systems; Industrial Press Inc.: New York, NY, USA, 1994; ISBN 978-0831130541.
14. Simion, D.; Purcărea, A.; Cotorcea, A.; Nicolae, F. Maintenance onboard ships using computer maintenance management system. Sci. Bull. “Mircea Cel Batran” Nav. Acad. 2020, 23, 134A–141A.
15. Cang, T.; Dung, V.A.; Thien, D.M.; Bich, V.N. Implementation of the Computerized Maintenance Management Systems (CMMS) for the Maritime Industry. In Proceedings of the World Congress on Engineering 2012, London, UK, 4–6 July 2012; International Association of Engineers: London, UK, 2010; Volume 2189, pp. 1103–1106.
16. Gašpar, G.; Poljak, I.; Orović, J. Computerized planned maintenance system software models. Pomorstvo 2018, 32, 141–145.
17. Lazakis, I.; Turan, O.; Aksu, S. Increasing ship operational reliability through the implementation of a holistic maintenance management strategy. Ships Offshore Struct. 2010, 5, 337–357.
18. Mortimer, A. Planned maintenance, systems and usage. In Motor Ship; Mercator Media Ltd.: Fareham, UK, 2014.
19. Davies, C.R. Computer-based planned maintenance programmes. Prop. Manag. 1990, 8, 40–60.
20. Kilkenny, M.F.; Kerin, M.R. Data quality: “Garbage in–garbage out”. Health Inf. Manag. J. 2018, 47, 103–105.
21. Tayi, G.K.; Ballou, D.P. Examining data quality. Commun. ACM 1998, 41, 54–57.
22. Batini, C.; Scannapieco, M. Data and Information Quality; Springer International Publishing: Cham, Switzerland, 2016.
23. Lee, Y.W.; Pipino, L.L.; Funk, J.D.; Wang, R.Y. Journey to Data Quality; The MIT Press: Cambridge, MA, USA, 2006.
24. Marmo, R.; Nicolella, M.; Polverino, F.; Tibaut, A. A methodology for a performance information model to support facility management. Sustainability 2019, 11, 7007.
25. Pipino, L.L.; Lee, Y.W.; Wang, R.Y. Data Quality Assessment. Commun. ACM 2002, 45, 211–218.
26. Batini, C.; Cappiello, C.; Francalanci, C.; Maurino, A. Methodologies for data quality assessment and improvement. ACM Comput. Surv. 2009, 41, 1–52.
27. Ballou, D.; Wang, R.; Pazer, H.; Tayi, G.K. Modeling information manufacturing systems to determine information product quality. Manag. Sci. 1998, 44, 462–484.
28. Strigaro, D.; Cannata, M.; Antonovic, M. Boosting a Weather Monitoring System in Low Income Economies Using Open and Non-Conventional Systems: Data Quality Analysis. Sensors 2019, 19, 1185.
29. Jan, S.-S.; Tao, A.-L. Comprehensive Comparisons of Satellite Data, Signals, and Measurements between the BeiDou Navigation Satellite System and the Global Positioning System. Sensors 2016, 16, 689.
30. Stazić, L.; Komar, I.; Račić, N. Evaluation Methodology for Ship’s Planned Maintenance System Database. Trans. Marit. Sci. 2017, 6, 109–116.
31. Tipgos, M.A.; Trebby, J.P. Job-Related Stresses and Strains in Management Accounting. J. Appl. Bus. Res. 1987, 3, 8–14.
32. Stazić, L.; Stanivuk, T.; Mihanović, V. Testing of the evaluation methodology for Ship’s Planned Maintenance System Database. J. Appl. Eng. Sci. 2019, 17, 273–279.
33. Stazić, L.; Komar, I.; Mihanović, L.; Mišura, A. Shipowner’s impact on planned maintenance system database quality grades resemblance equalization. Trans. Marit. Sci. 2018, 7, 5–22.
34. Vučinić, B. MA–CAD, Maintenance Concept Adjustment and Design. Ph.D. Thesis, Faculty of Mechanical Engineering and Marine Technology, Delft, The Netherlands, 1994.
35. Stazić, L.; Knežević, V.; Račić, N.; Orović, J. Fault Tree Analysis as A Replacement for Manufacturers’ Maintenance Instructions. In Proceedings of the 2nd International Conference of Maritime Science & Technology Naše More, Dubrovnik, Croatia, 17–18 September 2021; University of Dubrovnik, Maritime Department: Dubrovnik, Croatia, 2021; pp. 325–331. ISBN 978-953-7153-60-1.
36. Jishkariani, M. Fault Tree Analysis (FTA) For Energy Enterprises. 2020. Available online: https://www.researchgate.net/publication/341494947_Fault_Tree_Analysis_FTA_For_Energy_Enterprises (accessed on 18 July 2022).
37. Boryczko, K.; Szpak, D.; Żywiec, J.; Tchórzewska-Cieślak, B. The Use of a Fault Tree Analysis (FTA) in the Operator Reliability Assessment of the Critical Infrastructure on the Example of Water Supply System. Energies 2022, 15, 4416.
38. Čepin, M.; Mavko, B. A dynamic fault tree. Reliab. Eng. Syst. Saf. 2002, 75, 83–91.
39. Sánchez-Beaskoetxea, J.; Basterretxea-Iribar, I.; Sotés, I.; de las Mercedes Machado, M.M. Human error in marine accidents: Is the crew normally to blame? Marit. Transp. Res. 2021, 2, 100016.
40. Wärtsilä Corporation. MAN B&W 6S60MC-C Diesel Engine—Tanker LCC (Aframax), Trainee Manual; Wärtsilä Corporation: Portsmouth, UK, 2018.
41. Trevelyan, J. Reconstructing engineering from practice. Eng. Stud. 2010, 2, 175–195.
42. MAN Diesel SE. TCR 22-2-C1 Operating Instructions; MAN Diesel SE: Augsburg, Germany, 2007.
43. MAN Diesel & Turbo. TCR Turbocharger, Project Guide Book; MAN Diesel & Turbo SE: Augsburg, Germany, 2014.
Figure 1. DQA in practice, based on [25].
Figure 2. CMMS database evaluation process.
Table 1. The questionnaire.
Group | Color | No. | Question
Machinery and equipment | R | 1 | Are all machines and equipment recorded in the database?
 | R | 2 | Are all items of equipment properly recorded and clearly identified according to their location on board and their marking?
 | R | 3 | Are all required machines divided into subcomponents (smaller subsystems) in a logical manner?
 | Y | 4 | Does any machine or equipment have a greater number of subcomponents than necessary?
 | Y | 5 | Are machines or equipment listed more than once in the database or do they have the same markings or names?
 | Y | 6 | Are the manufacturer, type, and serial number data entered for all relevant items?
 | G | 7 | Do all entries for equipment and machinery have the same style, abbreviations, and identifiers?
Jobs inside DB |  | 8 | Is there a linked maintenance schedule for all equipment on DB according to the manufacturer’s recommendations?
 | R | 9 | Are the manufacturer’s recommendations organized by equipment, time periods, and company maintenance requirements?
 | R | 10 | Is all work required by company policy included in DB (e.g., SSM—Safety Management System)?
 | Y | 11 | Has all work based on manufacturer’s recommendations been modified based on company policy (if applicable)?
 | R | 12 | Is all work required by flag state regulations included in DB?
 | Y | 13 | Is all work required by the classification society included in DB?
 | R | 14 | Are there a number of smaller tasks that can be grouped together?
Special jobs and rules | R | 15 | Are fire alarm sensors included in DB along with the test plan?
 | Y | 16 | Is the alarm system and its test program entered in DB?
 | R | 17 | Is the PMS self-improvement program entered into DB, and is there a control mechanism for the PMS DB self-improvement program?
 | R | 18 | Is the critical equipment labeled in accordance with the company’s SMS?
DB jobs general | R | 19 | Are job descriptions clearly and unambiguously stated?
 | R | 20 | Are jobs created and grouped according to the multiplier principle?
 | G | 21 | Are all like jobs that originate from different sources synchronized?
 | Y | 22 | Are all similar jobs originating from different requirements (sources) merged?
Spare parts | Y | 23 | Are all required spare parts included in the database?
 | Y | 24 | Are the spare parts distributed to the correct equipment and machines?
 | R | 25 | Are all spare parts correctly identified, do they have sufficient data for ordering?
 | R | 26 | Is the company critical spare parts list inserted in the DB?
 | R | 27 | Do all spare parts have the same style, abbreviations, markings, etc.?
 | R | 28 | Are there spare parts that are entered more than once?
Miscellaneous | G | 29 | Are all users entered in the DB, are all access rights correctly defined?
 | Y | 30 | Are there any other deficiencies in the computerized PMS database?
Table 2. Deficiencies discovered in companies A, B, and C.
Minor Def.Major Deficiencies
Grade4321
ColorRYGRYGRYGRYG
A144----------
A233-31-2-12--
A37411--1-----
B112-11-------
B222-11-------
B3-2-11-------
B412-11-------
B512-11-------
C16621---1----
C276211-------
C355211--1----
C435141--1-1--
C566-11-------
C66621---1----
C755233----21-
C856111-------
Table 3. Deficiencies discovered in companies D and E.
Minor Def.Major Deficiencies
Grade4321
ColorRYGRYGRYGRYG
D1--131-23-112-
D2541---1--72-
D356-1--1--51-
D432123-2--72-
D554121-21-72-
D631----1--72-
D743-1--2--41-
D83511--11-61-
D9-1132-21-101-
D1055-1--1--41-
D1134----1--82-
D1242-1--1--72-
D1314----1--72-
D1454-11-1--51-
D15431---1--72-
D161-123-31-102-
D17731-1-11-72-
D1864-3--1--41-
D193514--21-61-
E1351-------1-
E2251-------1-
E3251-------1-
E4351-------1-
E5451-------1-
E6251-------1-
E7251-------1-
E8451-------1-
E9451-------1-
Table 4. Serious deficiencies in databases A, B and C.
Database | Grade 1 | Grade 2 | Grade 3 | Total
A2 | 3 | 2 | 2 | 7
A3 | 1 | 1 | - | 2
B1 | 1 | - | - | 1
B2 | 1 | - | - | 1
B3 | 1 | - | - | 1
B4 | 1 | - | - | 1
B5 | 1 | - | - | 1
C1 | 1 | - | - | 1
C2 | 1 | - | - | 1
C3 | 1 | - | - | 1
C4 | 4 | - | 1 | 5
C5 | 1 | - | - | 1
C6 | 1 | - | - | 1
C7 | 3 | - | 2 | 5
C8 | 1 | - | - | 1
Table 5. Serious deficiencies in databases D and E.
Database | Grade 1 | Grade 2 | Grade 3 | Total
D1 | 3 | 2 | 11 | 16
D2 | - | 1 | 7 | 8
D3 | 1 | 1 | 5 | 7
D4 | 2 | 2 | 7 | 11
D5 | 2 | 2 | 7 | 11
D6 | - | 1 | 7 | 8
D7 | 1 | 2 | 4 | 7
D8 | 1 | 1 | 6 | 8
D9 | 3 | 2 | 10 | 15
D10 | 1 | 1 | 4 | 6
D11 | - | 1 | 8 | 9
D12 | 1 | 1 | 7 | 9
D13 | - | 1 | 7 | 8
D14 | 1 | 1 | 5 | 7
D15 | - | 1 | 7 | 8
D16 | 2 | 3 | 10 | 15
D17 | - | 1 | 7 | 8
D18 | 3 | 1 | 4 | 8
D19 | 4 | 2 | 6 | 12
E1 | - | - | 1 | 1
E2 | - | - | 1 | 1
E3 | - | - | 1 | 1
E4 | - | - | 1 | 1
E5 | - | - | 1 | 1
E6 | - | - | 1 | 1
E7 | - | - | 1 | 1
E8 | - | - | 1 | 1
E9 | - | - | 1 | 1