1. Introduction
Global organizations and governments aim to establish effective healthcare systems that culminate in healthy societies over time. Good health and wellbeing is a central pillar of the United Nations 2030 Agenda for Sustainable Development, which seeks to expand healthcare coverage worldwide. The world population is projected to reach 8.5 billion by 2030 and 9.7 billion by 2050, a trend likely to increase the elderly share of the population from about 1 in 11 persons today to about 1 in 6 by 2050 [1]. Although the need for healthcare services is expected to grow rapidly in the near future, basic healthcare services remain inaccessible to the majority of the world population [2]. The Organisation for Economic Co-operation and Development notes that the current capacity of the global healthcare system falls significantly short of realistic healthcare needs due to insufficient investment [3]. The global experience of the COVID-19 pandemic and its immediate impacts on human lives exposed the wide gaps between effective healthcare systems and actual healthcare needs, undermining the sustainable provision of adequate healthcare services. In the case of Hong Kong, demographic change has compounded the pandemic's impact in exacerbating the need for healthcare facilities.
With close to 7.5 million people, Hong Kong is currently among the top five most densely populated jurisdictions in the world [4]. This implies a growing need for healthcare, particularly for the expanding elderly demographic. Hong Kong has responded over the years by regularly expanding its base of healthcare facilities. The Hospital Authority, a statutory body established in 1990 under the Hospital Authority Ordinance, has overseen and managed all public healthcare services in Hong Kong since December 1991. As of March 2023, the Hospital Authority managed 43 hospitals and institutions, 49 specialist out-patient clinics, and 74 general out-patient clinics, with a workforce of about 90,000 and over 30,000 beds. Records for the 2022/23 year show a total of 21.88 million patient attendances, outreach visits, and discharges [5]. Additionally, 14 major private healthcare facilities operating under the Hong Kong Private Hospitals Association complement the efforts of the Hong Kong government in providing healthcare services. Nevertheless, limited healthcare facilities and services mean that waiting times remain unfavorable to patients.
Hong Kong’s elderly demographic is expected to surpass a quarter of the population (i.e., 26.4%) by the year 2036 [6]. To address this and other healthcare needs, the Hong Kong government is currently undertaking two consecutive hospital development plans spanning 2016 to 2036, estimated at USD 64 billion in total value, to enable the upgrade, expansion, and development of several healthcare facilities [7]. The first 10-year plan entails the construction of a new acute hospital, the redevelopment or expansion of eleven existing hospitals, the construction of three new community health centers, and the construction of one new supporting services center. The second 10-year plan covers works on nineteen healthcare projects. Both plans are expected to provide 15,000 additional beds, more than 90 new operating theatres, and other facilities to address the forecasted service demand by 2036 [6,8]. Healthcare projects (HPs) entail the planning, construction, and operation of healthcare facilities and infrastructure for delivering diverse healthcare services. HPs are the most essential components of any effective healthcare system because they serve as the primary instruments for coordinating and integrating healthcare services to reach people [9]. Governments and organizations worldwide are showing commitment by making significant investments in HPs. As of mid-2024, notable HPs in the pipeline around the globe amount to USD 636.8 billion: North America accounts for USD 249.9 billion, western Europe for USD 138.3 billion, and north-east Asia for USD 62.1 billion [10]. HPs have unique features and characteristics, including alignment with rapidly changing healthcare legislation, consistency with best-standard value delivery, a dynamic and complex implementation process, large-scale design and planning qualifications, technological sophistication, a competitive marketplace, and numerous and changing implementation requirements [11,12].
Owing to the abovementioned issues, HPs are among the most challenging projects to plan, construct, and operate. Their success in the construction industry therefore hinges on a broad set of performance considerations.
Performance evaluation is relevant in the planning, construction, and operation of HPs for assessing the effectiveness and efficiency of management efforts and identifying areas for potential improvement. Amid several available evaluation systems such as post-occupancy evaluation, balanced scorecard, and benchmarking, the key performance indicator (KPI) system is probably the most prominent in construction research, particularly for measuring different aspects of HPs [13,14,15,16,17,18,19,20,21]. Performance evaluation requires responsible persons to set objectives, identify relevant KPIs, collect and analyze performance data, and forward feedback to the appropriate authorities or persons [22].
There are several limitations to performance evaluation systems for HPs specifically. First, the various KPI frameworks proposed for evaluating different aspects and segments of HP performance do not enjoy universal acceptance [23,24]. The development of performance evaluation systems depends on several factors, including the underlying purpose, the target users of results, the nature of the organization, and current industry trends [25]. Additionally, the performance expectations of HPs may change over time with factors such as applicable legislation. Thus, the available fragmentary KPI frameworks require consolidation, updating, modification, and redefinition to best meet the needs of organizations in the present industry. Second, numerous scholarly works have identified suitable KPIs for measuring HP performance [21,26,27]. However, there is limited work on how the proposed KPIs should be evaluated in an objective and reliable manner, rather than depending on the subjective semantic interpretations of individual practitioners. Although knowledge on “what to measure” (i.e., KPIs) is abundant in the literature, limited understanding exists on “how to measure” it, limiting the comprehensiveness and practicality of previously proposed models. Finally, past studies are generally limited to independent phases of the HPs investigated [28,29,30]. The successive phases of HPs (e.g., planning, construction, and operation) are interlinked in complex ways, such that performance issues experienced at an earlier phase can be transmitted to subsequent phases. Thus, the KPIs do not exist in isolation: KPIs that promote social sustainability can influence KPIs that promote economic and environmental sustainability.
For example, neglecting to incorporate maintainability principles in project designs (i.e., for social sustainability and longevity of the facility) may create many maintenance problems, which could negatively affect environmental sustainability (i.e., more resources for maintenance), economic sustainability (i.e., high cost of maintenance), and circular economy (i.e., less intense use of the facility, early retirement of the facility) [31,32]. Moreover, there could be service delivery complications during facility operation, affecting functional capability and social sustainability. Therefore, failure to treat the whole life cycle of HPs as a seamless chain of value creating activities is detrimental to their success.
This study is a component of broader research aimed at developing a computer-assisted project success index system to measure, monitor, control, enhance, and benchmark the performance of HPs. The present objectives are to (1) identify the most appropriate KPIs for measuring the life cycle success of HPs and (2) develop a composite HP success index (HPSI) for the construction industry. In separate studies, quantitative indicators will be developed as practical interpretations or definitions of the identified KPIs, and quantitative ranges will be established to grade different performance levels of the KPIs according to the expectations of Delphi experts. Finally, the broader research will conclude with the consolidation of all findings into a computerized system to assist practitioners in evaluating, monitoring, benchmarking, and improving HP performance in Hong Kong. Theoretically, the study contributes to research efforts toward consensus on pragmatic KPIs for measuring performance across the life cycle of HPs. The objectivity and practicality of the outcomes will enhance the understanding of organizations and practitioners of what constitutes a successful HP, in terms of effectiveness, efficiency, efficacy, and experience, helping owners and practitioners to design, construct, operate, and manage more successful HPs in Hong Kong. The paper is structured as follows. Following this introduction (Section 1), the research methods are presented (Section 2). Section 3 presents the results, followed by a discussion of the results (Section 4). Section 5 integrates the various underlying KPIs, and Section 6 demonstrates the model application. Finally, Section 7 presents the conclusion and recommendations for further research.
2. Research Methodology
A blended philosophical approach was employed: interpretivism to develop a thorough quantitative data collection tool, and positivism to refine this tool through a pilot survey before a cross-sectional questionnaire survey for later analysis [32]. The literature review drew on interpretivism to pinpoint the KPIs essential for hospital projects. The literature findings on KPIs then informed the questionnaire design. Subsequently, a postpositivist philosophy was applied through a pilot survey of experts to evaluate the questionnaire, and the KPIs were further refined by rephrasing, removing, or adding items. The refined questionnaire was used for quantitative data collection. These steps are detailed subsequently.
2.1. Systematic Literature Search
A systematic review was carried out by searching, screening, and synthesizing pertinent performance or success studies on HPs. The synonymous terms “healthcare project, healthcare center, health center, healthcare facility, hospital, clinic, infirmary, sanatorium, medical center, medical facility, convalescent home, and convalescent facility”, “success, failure, performance, KPI, benchmark, efficiency, and effectiveness”, and “building project, construction project, infrastructure project, engineering project, and construction industry” were combined and searched in the Scopus, Web of Science, and Google Scholar databases. In the Scopus database, the Boolean string of synonymous terms was searched in the title, abstract, and keyword domains of publications and returned 291 results. The same Boolean string was searched in the Web of Science database in the title, abstract, and keyword domains of publications, but this could only extract a handful of results. To obtain better and manageable results, the search was reconducted in all fields/domains of publications and returned 623 documents. The aforementioned results were complemented by searching different combinations of synonymous terms in the search pane of Google Scholar. The top twenty (20) publications generated and “sorted by relevance” were selected from the list of results. At the end, 419 unique publications were compiled from the Google Scholar search. The combined results of 1333 publications were obtained from Scopus, Web of Science, and Google Scholar searches conducted around mid-March of 2023 (Table 1). These publications were reduced to 806 in number upon eliminating duplicates. The titles and abstracts of the remaining publications were carefully reviewed and unrelated publications were also excluded. The unrelated publications focused on other fields and non-performance related subjects, e.g., effectiveness of hospital care delivery. 
After this screening process, 85 publications were selected for further consideration. Now, key areas of these selected publications (including introduction, results, discussion, and conclusions) were critically reviewed. Publications that did not identify, propose, or evaluate some success criteria, indicators, measures, parameters, or yardsticks at some life cycle phases (e.g., planning, construction, or operation) of HPs were regarded as less relevant and filtered out of the sample. At the end, 39 publications that centered on success or performance in the context of HPs were adjudged suitable so that the findings would be appropriate for empirical research and application in the HP sector. A detailed review of the final sample and consolidation and synthesis of the findings led to the identification and categorization of KPIs. The research phases, processes, methods, and outcomes of the study are shown in Figure 1.
Healthcare Project Success
Success criteria or KPIs are the performance dimensions that are measured to determine the success level of HPs. Studies have proposed different sets of KPIs for evaluating diverse aspects and phases of HPs. The review follows different categorizations of KPIs at the planning and construction phases, as well as the post-construction phase of HPs. In the planning and construction phases, the most popular KPIs are time performance, cost performance, and quality performance [17,18,33,34,35,36,37,38]. Historically, these three KPIs have developed into the basic criteria for measuring success in the project management field in general. Other classical criteria have been introduced to deepen the definition of success, including safety performance, environmental performance, productivity, and resource management [11,20,39]. HPs are naturally complex and uncertain, and they are planned and executed amid several unpredictable factors. These unavoidable conditions create both positive and negative risks for projects. The related KPIs entail planning and risk management effectiveness, rework scope, and the occurrence and magnitude of litigation, claims, and changes [40,41,42,43].
Project participants play important roles in designing, planning, and constructing HPs by contributing expertise and experiences and receiving corresponding rewards in the process. To verify that HPs engage the suitable participants to make the required inputs and obtain fulfilling outcomes, the criteria comprising human resource management, participant profitability, client/participant satisfaction, and participant professionalism, competency, and reputation have been proposed in the literature [39,42,44]. Obviously, the absence of effective networks of relationships creates many problems for HPs that require a joint effort and a smooth working environment to enable participants to deliver on complex and complicated requirements. Accordingly, KPIs, including trust and respect, teamwork and collaboration, communication effectiveness, conflict/dispute occurrence and magnitude, harmonious working relationships, and long-term business relationships, have been developed to underlie the measurement and improvement of such relationship-based performance areas [18,30,39]. The implementation of HPs undergoes regular changes because of the rapid update of technology and binding legislations [11,12]. As such, HPs ought to apply continual improvement methodologies to align with contemporary developments in the sector. The associated success criteria are learning and development, innovation and improvement, and building code adherence [11,39,45].
In the post-construction phase, a key consideration is how the constructed facilities meet the requirements of end-users and the objectives of the owners. The fulfilment derived from the operation of facilities is the “ultimate verdict” on whether HPs have achieved their development purposes. The KPIs pertaining to facility usage and satisfaction comprise long-term community/societal benefits, service lifespan, flexibility and adaptability, commercial profitability/value, service performance, maintenance interruptions to operations, stakeholder/end-user satisfaction, and functional suitability, capacity, and utilization [24,46,47,48,49]. In the course of operating healthcare facilities, there arises the need to regularly restore their condition to commendable standards. The restoration works may concern the architecture (e.g., painting), engineering (e.g., fixing structural defects), installations (e.g., replacing systems and equipment), and environment (e.g., upgrading the therapeutic experience). The effectiveness of facility management practices is likely to contribute to the quality of the healing process. A series of KPIs has been suggested to evaluate the facility management performance of HPs, including maintenance effectiveness, maintenance efficiency, operation and maintenance (O&M) expenditure, O&M safety performance, O&M statutory compliance, maintenance time performance, facility condition, and spare parts management [13,23,26,48,50,51,52].
The organizations operating and maintaining healthcare facilities play significant roles in overall project success across the life cycle. The better these organizations perform, the better shape healthcare facilities are in to deliver required healthcare services, and vice versa. The success criteria in this category comprise O&M policy/guideline deployment, O&M information management/sharing, and O&M organization/management effectiveness [15,27,51,53]. Sustainability is a critical topic in the global construction industry because the built environment has notable negative impacts on the environment through waste emissions and resource utilization, while itself suffering from disasters such as floods, earthquakes, fires, and windstorms. Scholarly experts believe that HPs should be executed in a way that minimizes potential harms and maximizes the resulting benefits over the life cycle. The sustainability-related KPIs are water and waste management, energy utilization, resilience and sustainability, and the current replacement value of the facility [14,16,28,54,55]. The surrounding environment is as important as the actual healthcare facilities in delivering result-oriented healthcare services due to its impacts on people’s perception and adjustment. For instance, a noisy surrounding environment can hinder or prolong the healing process of patients. KPIs such as site/location optimization, visual appearance and appeal, facility integration into the locality, and healthcare culture/image embeddedness have been established to assess the quality of the surrounding environment of healthcare facilities [14,19,21,29,45].
In summary, 54 KPIs were identified from global scholarly works on HP success, performance, or evaluation. By broadly categorizing them, one set of 27 KPIs is suitable for measuring success at the planning and construction phases and another set of 27 KPIs is appropriate for evaluating success at the post-construction phase of HPs. The KPIs at the planning and construction phases were sub-categorized into classical measures, uncertainty and risk, project relationships, project participants, and project improvements groups. Also, the KPIs relevant for the post-construction phase were sub-categorized into facility usage and satisfaction, facility management, organization, sustainability, and surrounding environment groups. This success evaluation framework is comprehensive and versatile because the selected KPIs have been applied to HPs in different jurisdictions including Hong Kong, Ghana, Malaysia, Indonesia, Australia, USA, China, Singapore, Israel, Canada, Spain, UK, and others. Practically, the success evaluation framework will support the decision-making of organizations across the life cycle of HPs by underlying the measurement and improvement of relevant performance areas.
2.2. Data Collection Technique
Presently, there is limited historical and empirical information on using KPIs to assess and monitor the success of HPs over the lifecycle. It is necessary to adopt an approach that explores factual data from the rich expertise and experiences of qualified respondents. Hence, the Delphi method is the most appropriate choice above other methods such as staticised groups, the nominal group technique (NGT), and focus groups [56,57]. This method solves complicated problems using experts’ opinions, self-validates outcomes through consecutive rounds, eliminates the bias effect and pressure to conform, enhances confidentiality through avoidance of direct communication among experts, preserves heterogeneity of experts for valid outcomes, and allows objective analysis using different statistical techniques [56,57,58]. In terms of limitations, the Delphi method may not be appropriate for empirical problem solving with a small budget and limited resources, a short time requirement, large sample sizes of respondents, less experienced and qualified respondents, and high attrition of respondents over time. Nevertheless, the Delphi method has gained popularity in complicated construction engineering and management (CEM) research, including sustainable development [59], team integration [60], procurement selection [61], performance evaluation [62,63], and risk assessment and allocation [57]. The process followed in this research study is presented in Figure 1.
2.3. Design and Pilot Testing of Survey Instruments
The KPIs (with definitions) identified from the literature were used to design templates of questionnaires for the Delphi rounds in line with Yeung et al. [62]. The initially prepared questionnaire templates were piloted on two international academics and one local practitioner with experience in HP development. The comments received were considered in revising and finalizing the questionnaires.
2.4. Data Collection Process
The main steps involved in the Delphi process include selecting pre-defined experts, setting the number of survey rounds, and structuring the questionnaires for every round [59]. Decisions on the number of survey rounds depend on the targeted convergence level and improvement of accuracy [56,64] to mitigate participatory fatigue, time and resource constraints, and attrition rate [65]. This study used four survey rounds to obtain information from the experts in line with the CEM norm of two to six rounds [66].
The scale used for the survey was a five-point Likert scale described as 1 = least important to 5 = most important. The scale was chosen because of its relative brevity in collecting responses and its suitability for evaluating unipolar dimensions [60,62,63]. Given the expertise requirement, it is believed that the respondents understood the scale labels and assigned scores to the KPIs appropriately.
To exercise proper control over the Delphi process, the experts were broadly educated on what the study was about, the expected benefits, and their required commitment [57]. Again, the experts were kept in close contact, communication with experts was clear and effective, survey instruments were designed to be simple with a completion duration of about 20 min, surveys were administered in person and by email, and follow-ups were conducted through email and personal visits before the completion deadlines [57,61,62]. These measures somewhat sustained the interest of experts, minimized attrition, and boosted response rates and promptness [56].
2.5. Panel Experts’ Selection
The authenticity of a Delphi study depends greatly on how researchers select qualified experts carefully and objectively [61,64]. Experts should be willing and readily available to review previous opinions so that the consensus level can be boosted [64]. Purposive and snowball sampling approaches were adopted to select qualified experts in this study [57]. Initially, formal invitation letters were delivered to targeted organizations that are involved in HP development, e.g., Hospital Authority, Architectural Services Department, and construction firms. The recipients were requested to nominate suitable experts who work within or with the organizations based on the set criteria [59,61,62]. The snowball sampling was complementarily used to opportunistically request the identified experts to recommend other known experts [57]. Apart from their roles in the organizations, the experts were required to meet the following developed criteria to be eligible for the panel [61,62,66]:
Knowledge and in-depth understanding of the planning, construction, and/or operation of HPs;
Recent hands-on experience in planning, constructing, and/or operating HPs; and
Played leading roles in the construction industry.
Based on availability and readiness, 19 experts performing significant roles in their organizations were qualified to join the panel and serve as Delphi respondents. This is adequate and in conformance with CEM research because typical panel sizes range from 3 to 93 experts [66]. The background information revealed that the responses represented a balanced view of experienced construction professionals and key stakeholders involved in HP implementation and operation (Table 2). Proper control was exercised over the Delphi process to extract substantial information from the experts within time and resource constraints.
2.6. Formats of the Delphi Survey Rounds
The Round 1 questionnaire was issued to the 19 qualified experts through email around mid-May 2023. The experts were requested to select a maximum of 6 KPIs separately from the planning and construction phase and post-construction phase perceived as being the most representative for measuring HP success. They were encouraged to inclusively suggest and select new KPIs that were not already covered in the checklist and yet were applicable in the Hong Kong context. Follow-ups were conducted until all experts returned the completed questionnaires by the end of July 2023.
The Round 2 questionnaire was issued to panelists in early August 2023. The essence of this round was for experts to affirm or change the selection of KPIs from Round 1. The consolidated feedback (i.e., frequency percentages of KPIs) from Round 1 was provided for experts’ reference. Necessary follow-ups were conducted on experts who could not complete the questionnaire early. By mid-September 2023, all 19 experts had returned duly filled questionnaires.
The Round 3 questionnaire was issued to the panel experts around mid-September 2023. The experts were asked to rate the importance levels of the ten shortlisted KPIs by using the five-point Likert scale. The consolidated feedback from Rounds 1 and 2 was provided for experts’ reference. Upon numerous follow-ups, only 15 questionnaires were completed and returned by late October 2023. Four experts withdrew due to commitments and workload.
In Round 4, the questionnaire was issued to the remaining experts, and they were given the opportunity to confirm or change their previous ratings of the ten shortlisted KPIs in light of the feedback information provided. The survey duration extended across November 2023, allowing the 15 remaining experts to make contributions upon several follow-ups. Thus, the response rate was 79% minimum across the four rounds. This is comparable to Yeung et al. [62] and Yeung et al. [63] with response rates of 79.5% and 35.56%, respectively. Questionnaires for the four survey rounds are attached as Supplementary Materials.
2.7. Analysis Methods
IBM SPSS 20.0 and Excel 2021 were used to perform statistical analyses on the survey data, including Cronbach’s reliability analysis (α), Kendall’s coefficient of concordance (W), frequency analysis, mean score ranking (MS), factor analysis (FA), and fuzzy synthetic evaluation (FSE).
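As a minimal sketch of the reliability analysis named above, Cronbach’s α can be computed directly from a respondents-by-items rating matrix; the rating values below are hypothetical and purely illustrative, not the study’s data.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) Likert rating matrix."""
    k = ratings.shape[1]                          # number of items (KPIs)
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical five-point Likert ratings: 4 respondents x 3 KPIs
scores = np.array([[4, 5, 4],
                   [3, 4, 3],
                   [5, 5, 4],
                   [2, 3, 2]])
alpha = cronbach_alpha(scores)
```

Values of α above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold depends on the research context.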
2.7.1. Frequency Analysis
The proportions of experts selecting the KPIs formed the basis for computing frequency percentages and selecting appropriate KPIs. Only KPIs preferred by at least 50% of the panelists were considered significant and shortlisted [61]. This approach is reasonable because the essence of the Delphi process is to ensure adequate consistency among the experts’ solutions or perceptions.
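The 50% shortlisting rule described above can be sketched as follows; the KPI names and selection counts are hypothetical, and the 19-expert panel size follows the study design.

```python
# Hypothetical Round 1 selections: KPI -> number of experts (of 19) who chose it
selections = {
    "Construction quality performance": 17,
    "Construction time performance": 16,
    "Construction cost performance": 15,
    "Rework scope": 6,
}
n_experts = 19

# Keep only KPIs selected by at least 50% of panelists; store frequency percentages
shortlist = {kpi: round(100 * count / n_experts, 1)
             for kpi, count in selections.items()
             if count / n_experts >= 0.5}
```

The resulting frequency percentages are the consolidated feedback returned to the panel in the subsequent Delphi round.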
2.7.2. Kendall’s (W) Analysis
Kendall’s W was computed to test the null hypothesis that the experts’ ratings of KPIs were totally unrelated. An agreement level of 0 means that the experts’ ratings are totally unrelated, whereas an agreement level of 1 means that the experts’ ratings are completely identical [66].
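A minimal sketch of Kendall’s W for a raters-by-items score matrix, with the usual chi-square approximation for significance, is shown below; the score matrix is hypothetical, and ties are broken arbitrarily rather than by the tie-corrected average-rank formula.

```python
import numpy as np
from scipy.stats import chi2

def kendalls_w(ratings: np.ndarray):
    """Kendall's W and its chi-square p-value for a (raters x items) matrix."""
    m, n = ratings.shape
    # Convert each rater's scores to ordinal ranks (ties broken arbitrarily;
    # a tie-corrected version would use average ranks instead)
    ranks = ratings.argsort(axis=1).argsort(axis=1) + 1
    col_sums = ranks.sum(axis=0)
    s = ((col_sums - col_sums.mean()) ** 2).sum()
    w = 12 * s / (m ** 2 * (n ** 3 - n))
    # Significance: chi-square statistic m(n-1)W with n-1 degrees of freedom
    p_value = chi2.sf(m * (n - 1) * w, n - 1)
    return w, p_value

# Hypothetical example: three raters ranking four KPIs in perfect agreement
scores = np.array([[1, 2, 3, 4],
                   [1, 2, 3, 4],
                   [2, 3, 4, 5]])
w, p = kendalls_w(scores)
```

A small p-value rejects the null hypothesis of unrelated ratings, i.e., it supports consensus among the experts.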
2.7.3. Mean Score Ranking
MS is a commonly used technique in construction management research to establish the relative importance or criticality of factors [67]. In this study, it was adopted to rank the shortlisted KPIs for evaluating HP success based on importance levels.
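The MS ranking reduces to averaging each KPI’s ratings across experts and sorting in descending order, as in this brief sketch with hypothetical KPI names and ratings.

```python
import numpy as np

# Hypothetical five-point Likert ratings (rows = experts, columns = KPIs)
ratings = np.array([[5, 3, 4],
                    [4, 3, 5],
                    [5, 2, 4]])
kpis = ["Quality performance", "Innovation and improvement", "Safety performance"]

mean_scores = ratings.mean(axis=0)                           # MS per KPI
ranking = sorted(zip(kpis, mean_scores), key=lambda kv: -kv[1])  # highest first
```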
2.7.4. Factor Analysis
Given the list of interrelated KPIs, FA was performed to reduce them to a manageable set of KPI components. Specifically, principal component factor analysis (PCFA) with varimax rotation and Kaiser normalization was used to extract the factors [68]. Prior to the PCFA, statistical tests including the reliability of the dataset, the correlation matrix, Bartlett’s test of sphericity, and the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy were computed to check the appropriateness of the factor model [67]. An eigenvalue benchmark of 1.0 was used to determine the principal factors to retain. Moreover, to adequately represent significant relationships among the extracted components, only factor loadings greater than 0.5 were considered [69].
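Two of the checks above, the eigenvalue-greater-than-1.0 retention rule and Bartlett’s test of sphericity, can be sketched with numpy/scipy as follows; the synthetic two-factor dataset is purely illustrative, and varimax rotation and KMO (available in dedicated factor-analysis packages) are omitted for brevity.

```python
import numpy as np
from scipy.stats import chi2

def pcfa_retention(data: np.ndarray):
    """Kaiser-criterion factor count and Bartlett's sphericity p-value."""
    n, p = data.shape
    r = np.corrcoef(data, rowvar=False)        # correlation matrix of the KPIs
    eigvals = np.linalg.eigvalsh(r)[::-1]      # eigenvalues, descending
    n_factors = int((eigvals > 1.0).sum())     # retain eigenvalues > 1.0
    # Bartlett's test of sphericity: H0 = identity correlation matrix
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(r))
    p_value = chi2.sf(stat, p * (p - 1) / 2)
    return n_factors, p_value

# Hypothetical ratings driven by two latent factors (seeded for repeatability)
rng = np.random.default_rng(42)
latent = rng.standard_normal((300, 2))
data = latent[:, [0, 0, 0, 1, 1, 1]] + 0.3 * rng.standard_normal((300, 6))
n_factors, p_value = pcfa_retention(data)
```

Here the two planted latent factors are recovered, and the small Bartlett p-value confirms the correlation matrix departs from identity, i.e., the data are factorable.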
2.7.5. Fuzzy Synthetic Evaluation (FSE)
FSE is an aspect of fuzzy set theory (FST) that was introduced by Zadeh [70] “for representing and manipulating ‘fuzzy’ terms… [and] uses degrees of membership in sets rather than strict true/false membership” [71], p. 494. With this modeling technique, multi-evaluations and multi-attributes could be quantified appropriately [72]. Evaluating HP success is a multi-criteria decision-making process because there may be many decision makers, and uncertainty, imprecision, and incomplete information surround the decision-making process [73]. Given the unavoidable linguistic terms such as poor, good, and excellent performance in the fuzzy environment, FSE could be utilized to reach credible decisions from vague facts by defining them linguistically [74]. FSE has diverse applications in construction research, including risk evaluation [66], critical success factors evaluation [75], and PPP implementation evaluation [76].
FSE was used to model the role of KPIs in evaluating HP success. The fuzzy modeling method is preferable to other probabilistic modeling methods in terms of practicality and the complexity of the algorithms [75]. For instance, compared with the normal weighted method, FSE is appropriate because it can better objectify and handle the subjective judgment prevalent in natural human thinking [77]. Additionally, the FSE technique has been used for modeling in research where sample sizes are relatively small, i.e., fewer than ten sample points [78,79]. There is no universally agreed sample size criterion for conducting FSE. Hence, coupled with the self-validating Delphi process, the FSE technique is an appropriate choice for this study based on 15 expert responses. The steps below were followed for the FSE modeling [67]:
1. Establish the basic criteria set U = {u_1, u_2, …, u_n}, where n represents the number of criteria.
2. Label the set of grade choices as L = {L_1, L_2, L_3, L_4, L_5}. The set of grade choices represents the points on the scale of measurement. Thus, the five-point scale is represented as: L_1 = least important, L_2 = fairly important, L_3 = important, L_4 = very important, and L_5 = most important.
3. Compute the weighting for each criterion or factor component. Using the survey results, the weighting (W_i) is computed as:
W_i = M_i / Σ_{i=1}^{n} M_i, 0 < W_i < 1 (1)
where W_i = the weighting, M_i = the mean score of a specific criterion or factor component, and Σ M_i = the sum of the corresponding mean ratings.
4. Apply the fuzzy evaluation matrix to each factor component. The evaluation matrix is represented as R = (r_ij)_{n×5}, where r_ij is the extent to which grade choice L_j satisfies the criterion u_i.
5. Derive the final FSE results from the weighting vector and the fuzzy evaluation matrix using the formula below:
D = W ∘ R (2)
where D = the final FSE matrix and ∘ = the fuzzy composite operator.
6. Normalize the final FSE matrix and compute the HPSI for a specific factor component using the following formula:
HPSI = Σ_{i=1}^{5} D_i × L_i (3)
where D_i = the i-th element of the normalized FSE matrix and L_i = the corresponding grade value (1 to 5).
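The six steps can be sketched numerically as follows. The mean scores and rating distributions are illustrative, not the study's data, and the fuzzy composite operator ∘ is assumed to be the weighted-average (matrix product) operator.

```python
import numpy as np

# Step 3: weightings from Equation (1), using illustrative mean scores
means = np.array([4.2, 3.9, 4.5])     # mean rating of each criterion (illustrative)
W = means / means.sum()               # weighting vector; elements sum to 1

# Step 4: fuzzy evaluation matrix R; row i holds the proportion of experts
# who rated criterion i at grades L1..L5 (each row sums to 1)
R = np.array([
    [0.00, 0.00, 0.13, 0.40, 0.47],
    [0.00, 0.07, 0.27, 0.33, 0.33],
    [0.00, 0.00, 0.00, 0.13, 0.87],
])

# Step 5: Equation (2), D = W ∘ R, taking ∘ as the weighted-average operator
D = W @ R

# Step 6: Equation (3), the index is the sum of D_i times the grade values 1..5
grades = np.array([1, 2, 3, 4, 5])
index = float(D @ grades)
```

Because the weightings sum to 1 and each row of R sums to 1, D also sums to 1, and the resulting index lies on the original 1-to-5 scale.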
3. Results
3.1. Selecting the Most Relevant KPIs: Delphi Survey Rounds 1 and 2
Table 3 presents the selection prioritization of KPIs among the panel experts. Only the KPIs selected by the majority of the panelists (i.e., meeting the 50% shortlisting criterion) were considered significant for further consideration. The ranking follows the percentage of experts that selected each KPI. In both rounds, the traditional iron triangle of construction quality performance, construction time performance, and construction cost performance were significant KPIs for evaluating HP success at the planning and construction phase, together with construction safety performance and innovation and improvement. The results of Rounds 1 and 2 both suggest that stakeholder/end-user satisfaction, functional suitability, maintenance effectiveness and efficiency, and functional capacity and utilization are the significant KPIs for measuring HP success at the post-construction phase, in addition to flexibility and adaptability of facility, found only in Round 2. Meanwhile, a newly suggested factor, “* Defects rectifications and improvement extent owing to design afterthought”, did not rank high enough to be shortlisted.
3.2. Rating the Shortlisted KPIs: Delphi Survey Rounds 3 and 4
The ranking of the ten shortlisted KPIs based on mean ratings is indicated in Table 4. The top-ranked KPIs of HP success were construction quality performance (R1 rank = 1; R2 rank = 1), construction safety performance (R1 rank = 3; R2 rank = 2), and stakeholder/end-user satisfaction (R1 rank = 1; R2 rank = 3). Largely, it is observable that the mean scores of the top-ranked KPIs improved in Round 4, indicating that the experts reconsidered these KPIs to be extremely significant for the purpose. Additionally, the high ranking of KPIs from both the planning and construction phase and the post-construction phase demonstrates the relevance of the entire lifecycle in comprehensively assessing HP success. This makes sense because HPs are social projects in nature, and their true value is witnessed during operation in improving health and sustaining the lives of global citizens; for example, governments worldwide depended heavily on healthcare facilities to mitigate the impact of the COVID-19 pandemic. Although all KPIs are important, focusing on these ten shortlisted KPIs could help benchmark, track, and improve HP success significantly in Hong Kong.
The Kendall’s coefficient of concordance (W) values obtained for the panel in Round 3 and Round 4 were 0.395 and 0.594, respectively, and both were significant at the 1% statistical level (Table 4). Accordingly, the null hypothesis that no significant agreement occurs among the experts’ ratings was not supported. The results compare favorably with Yeung et al. [63], who obtained Kendall’s W values of 0.123 and 0.253 in similar Rounds 3 and 4, respectively. Although the agreement level of the experts would likely have increased further from the initial 50.38% with extended survey rounds, this study was limited to four rounds to reduce the attrition rate and maintain response quality.
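Kendall's W for an m-rater, n-item rating matrix can be computed as below. This is a minimal sketch without the tie-correction term, on illustrative data rather than the study's ratings.

```python
import numpy as np

def _avg_ranks(x):
    """Rank a 1-D array, assigning tied values their average rank."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x, kind="stable")
    ranks = np.empty(x.size)
    ranks[order] = np.arange(1, x.size + 1)
    for v in np.unique(x):
        tie = x == v
        ranks[tie] = ranks[tie].mean()
    return ranks

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W (no tie correction)
    for an (m raters x n items) matrix of scores."""
    m, n = ratings.shape
    ranks = np.vstack([_avg_ranks(row) for row in ratings])
    col_sums = ranks.sum(axis=0)                   # total rank per item
    S = ((col_sums - col_sums.mean()) ** 2).sum()  # spread of the rank totals
    return 12.0 * S / (m ** 2 * (n ** 3 - n))

# Perfect agreement among 3 raters over 4 items yields W = 1
perfect = np.array([[1, 2, 3, 4]] * 3, dtype=float)
```

W ranges from 0 (no agreement) to 1 (complete agreement), so rising values across Delphi rounds, as reported above, indicate converging expert opinion.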
3.3. Identification of KPI Groupings for Healthcare Project Success
The 10 shortlisted KPIs were grouped by using the PCFA technique. Statistical tests were conducted on the data prior to the PCFA for appropriateness checks. The Cronbach’s alpha (α) value obtained for the 10 selected KPIs was 0.858, exceeding the recommended 0.70 benchmark [80]. This shows that the responses of the panel experts are consistent and the scale adopted is reliable. Also, the correlation matrix reveals strong relationships among the 10 selected KPIs, as most of the correlation coefficients were above 0.30 [68]. The sampling adequacy was determined by the KMO statistic, which returned an acceptable value of 0.589, greater than the recommended 0.50 level [68]. Hence, the sample was considered adequate for a satisfactory PCFA. Lastly, Bartlett’s test of sphericity returned a significant chi-square (χ²) value of 95.87 (p < 0.01), so it can be inferred that the correlation matrix is not an identity matrix [68]. All the tests confirm that PCFA was appropriate for this study.
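The reliability check can be reproduced with the standard Cronbach's alpha computation. This is an illustrative sketch, not the study's data.

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an (n respondents x k items) rating matrix."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)        # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Sanity check: k identical items give alpha = 1 (perfect internal consistency)
base = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
alpha = cronbach_alpha(np.column_stack([base] * 4))
```

Values above the 0.70 benchmark, such as the 0.858 reported above, indicate that the items measure the construct consistently.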
The principal component extraction and varimax rotation options were used to generate a three-factor solution. The three factors had eigenvalues of at least 1.0 and together explained 76.25% of the total variance, and all item loadings had absolute values above 0.50. The resultant three-factor model was therefore an adequate representation of the dataset. The factor groupings were subjectively labeled as follows (Table 5):
KPIG 1—Project prosecution performance;
KPIG 2—Project purpose performance; and
KPIG 3—Project people performance.
These three groupings are believed to sufficiently explain and evaluate HP success.
3.4. Deriving the HPSI for Each KPIG of Healthcare Projects
To generate the HPSIs for the respective KPIGs, two levels were established prior to using the FSE technique for the modeling. The first level was the groups (KPIGs) and the second level was the individual factors (KPIs). Overall, three KPIGs and ten KPIs were established for the modeling process. The FSE procedure for evaluating HP success is demonstrated in the subsequent sections.
Step 1: Calculate the weightings of KPIs and KPIGs
The appropriate weightings of the KPIs and KPIGs were calculated with Equation (1) by using the mean scores from Table 4. An illustration of the computations from Table 5 follows. For instance, KPIG 1 (project prosecution performance) comprised six KPIs with a total mean score of 22.47. Hence, the appropriate weighting for construction time performance (KPI5) was computed as its mean score in Table 4 divided by 22.47.
Accordingly, the appropriate weightings of all the other KPIs and the KPIGs for construction projects were calculated (see Table 5).
Step 2: Establish the membership functions for the KPIs and KPIGs
In applying the FSE technique, two levels of membership functions (MFs) were determined, from level 2 to level 1. The fuzzy MF, which shows the degree to which an element belongs to a fuzzy set, usually ranges from 0 to 1. While a value of 1 shows full membership, 0 represents no membership of the corresponding element in the fuzzy set [77]. In determining the MFs of the KPIGs, the MFs of the KPIs were first established, similar to the procedure for computing the weightings. The MF of a KPI is established based on the percentages of respondents who selected the respective five defined grades, i.e., L_1 to L_5. Given the example of “construction quality performance” (KPI1), 13% and 87% of panel experts rated it very important and most important, respectively. Hence, the MF for KPI1 was expressed as:
MF_KPI1 = 0.00/L_1 + 0.00/L_2 + 0.00/L_3 + 0.13/L_4 + 0.87/L_5 (4)
In a simpler format, the MF of KPI1 may be expressed as (0.00, 0.00, 0.00, 0.13, 0.87). The MFs of all KPIs were expressed following the procedure already described (Table 5). At level 1, the MFs of the KPIs formed the basis for computing the MFs of the KPIGs by using Equation (2). For instance, the MF of KPIG 3 (project people performance) was computed by applying the fuzzy composite operator to the weighting vector and the MF matrix of its two underlying KPIs.
Similarly, the MFs of KPIGs 1 and 2 were computed and all are presented in Table 5.
The final step was to derive the HPSI for each of the KPIGs. The MFs of the KPIGs at level 1 were used to derive the respective HPSIs with the help of Equation (3), yielding HPSI values of 3.80 for KPIG 1, 4.68 for KPIG 2, and 4.77 for KPIG 3.
Step 3: Developing an overall HPSI model
Since the KPIGs were not significantly correlated with one another, a linear and additive model was adopted to develop the composite HPSI for evaluating HP success [63]. To formulate the linear and additive model, all the KPIGs were first normalized. This is logical and valid because practitioners can easily compare the relative weightings of the variables forming the linear equation and can focus attention on the components of the model to improve HP success predictably. Moreover, normalization provides flexibility for practitioners in choosing the most fitting measurement scales for accurate evaluation [60,64]. The derivation of the coefficients is given by Equation (5):
λ_i = HPSI_i / Σ_{j=1}^{3} HPSI_j (5)
where λ_i = the coefficient of KPIG i in the linear model.
In essence, the composite HPSI was expressed in the linear equation as Equation (6):
HPSI = 0.287 × KPIG 1 + 0.353 × KPIG 2 + 0.360 × KPIG 3 (6)
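The normalization in Equation (5) can be verified against the three HPSI values reported in Sections 4.1–4.3:

```python
# HPSI values of the three KPIGs as reported in Sections 4.1-4.3
hpsi = {"KPIG1": 3.80, "KPIG2": 4.68, "KPIG3": 4.77}
total = sum(hpsi.values())                               # 13.25
coeff = {k: round(v / total, 3) for k, v in hpsi.items()}
# coeff -> {'KPIG1': 0.287, 'KPIG2': 0.353, 'KPIG3': 0.36}
```

Dividing each HPSI by their sum reproduces the coefficients 0.287, 0.353, and 0.360 of Equation (6), confirming that the coefficients are simply the normalized indices.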
4. Discussion of Results
The HP success evaluation model (Equation (6)) reveals that “project people performance” had the highest coefficient (0.360), closely followed by “project purpose performance” (0.353) and “project prosecution performance” (0.287) (Figure 2). The ranking of the categories of KPIs, as revealed in Figure 2, shows a progression from short-term or fundamental KPI categories to long-term or ultimate KPI categories. The indices were combined to produce a linear model, which offers a more objective and reliable approach to aid practitioners in evaluating and comparing HP success levels. Practitioners can further benchmark, monitor, and improve HP success levels predictably.
4.1. Project Prosecution Performance (KPIG 1)
The KPIG 1 obtained 43.91% of the total variance explained, and its factor loadings ranged from 0.724 to 0.895 in the PCFA. It was assigned the smallest coefficient (0.287) in the linear model upon obtaining an HPSI of 3.80. Its underlying KPIs, in descending order, were ‘construction time performance’, ‘construction cost performance’, ‘maintenance effectiveness and efficiency’, ‘functional capacity and utilization’, ‘innovation and improvement’, and ‘flexibility and adaptability of facility’. These KPIs mainly relate to the processes and actions required to implement and operate HPs successfully. For instance, ‘construction time performance’ and ‘construction cost performance’ assess whether the time and cost of project execution are within or exceed the schedule and budget, respectively. Such assessments are mostly conducted quantitatively through variance analyses. Cost variance, for instance, is the difference between the earned value amount and the cumulative actual costs of a project [81], p. 258. This metric can also be expressed as an index, the cost performance index, which is the ratio of the earned value amount to the cumulative actual cost of a project. Schedule variance, similarly, is the difference between the earned value and the planned value, and can be expressed as the schedule performance index, which is the ratio of the earned value to the planned value [81], p. 259. These variance analyses are key to assessing the progress of a project, and they can be used to assess project performance against the triple bottom line of sustainability, specifically economic sustainability. The other KPIs, such as ‘maintenance effectiveness and efficiency’, ‘functional capacity and utilization’, ‘innovation and improvement’, and ‘flexibility and adaptability of facility’, are suitable for assessing the project before or during its use. Tushar et al. [82] identified maintenance effectiveness as an important factor that can enhance the service quality of hospitals, ensure affordable and reliable service, and optimize equipment functionality. Similarly, Ebekozien [83], p. 32, stated that “maintenance is a massive investment if functionality and quality are to be sustained”. Fotovatfard and Heravi [84] emphasized the impact of maintenance on saving energy, concluding that condition-based maintenance is the most effective maintenance strategy for energy saving. Condition-based maintenance entails inspecting equipment at fixed time intervals and replacing outdated equipment to improve functionality and reduce maintenance costs.
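The variance analyses above follow standard earned-value management definitions and can be sketched as follows (the figures are illustrative):

```python
def earned_value_metrics(ev, ac, pv):
    """Variance analyses from earned-value management:
    ev = earned value, ac = cumulative actual cost, pv = planned value."""
    return {
        "cost_variance": ev - ac,      # CV = EV - AC
        "cpi": ev / ac,                # cost performance index = EV / AC
        "schedule_variance": ev - pv,  # SV = EV - PV
        "spi": ev / pv,                # schedule performance index = EV / PV
    }

# Illustrative figures: 100 earned, 110 spent, 120 planned to date
m = earned_value_metrics(100.0, 110.0, 120.0)
# CPI < 1 and SPI < 1 signal cost overrun and schedule slippage
```

Negative variances (or indices below 1) flag cost and time overruns against ‘construction cost performance’ and ‘construction time performance’, respectively.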
Quantitative metrics could be deployed for assessing these indicators. For example, ‘flexibility and adaptability of facility’ can be measured by assessing how easily the facility and its installation systems can be adapted to accommodate additional demands from the end-users. This possibility can be expressed using a Likert scale. Similarly, ‘maintenance effectiveness and efficiency’ can be assessed based on the frequency or quality of maintenance activities, e.g., zero, less, or more maintenance backlog (expressed on a Likert scale). Regarding the triple bottom line of sustainability, these indicators are essential for assessing the level of social sustainability attainment. Additionally, they could also serve as indicators for environmental sustainability assessments. For instance, ‘flexibility and adaptability’ assesses how easily a healthcare facility can adapt to accommodate additional demands from the users. This could prevent building appendages and ensure a longer use of the facility, thus promoting a circular economy. Moreover, with environmental sustainability embraced as the de facto standard in the healthcare sector [85], ‘innovation and improvement’ is an essential KPI for assessing HPs in this regard, although it is a multifaceted concept. Thus, innovation regarding reuse, recycling, and reduction of resource consumption is an essential KPI for assessing the environmental sustainability performance of HPs. It also entails assessments of carbon emissions to track greenhouse gas emissions from HPs. ‘Innovation and improvement’ could be assessed by the number of environmental excellence awards received by the HP [85].
4.2. Project Purpose Performance (KPIG 2)
The KPIG 2 explained about 17.99% of the total variance, and its two item loadings were −0.802 and 0.856. It had a higher HPSI of 4.68 and a corresponding linear equation coefficient of 0.353. The underlying KPIs were ‘construction quality performance’ and ‘functional suitability’. This set of KPIs relates to how the implementation of HPs meets the defined purpose and objectives. ‘Construction quality performance’ can be measured as the cost of rectifying major defects or nonconformances over the total project cost. This metric can be expressed as a percentage by multiplying it by 100. Based on the estimated percentage value, a Likert scale (i.e., poor performance expectation, average performance expectation, good performance expectation, very good performance, and excellent performance) could be deployed to rate the ‘construction quality performance’ of a project. To improve this KPI, training workers to enhance their expertise and compliance, and purchasing and using the right materials and equipment, are all necessary to reach the required level of quality. These activities have cost implications and are, therefore, referred to as the cost of conformance to quality [81]. In contrast, the cost of nonconformance to quality, or cost of failure, is the cost incurred for not satisfying the quality expectations of the project. Such a cost could result from not using the right materials for construction or from a lack of adequate training of the workforce [81]. ‘Functional suitability’, in turn, can be measured as the cost of modifying the facilities (i.e., buildings, components, installations, machinery, systems, etc.) to meet relevant functional requirements as part of the current plan (expressed as a percentage).
Likewise, a Likert scale (i.e., poor performance expectation, average performance expectation, good performance expectation, very good performance, and excellent performance) could be utilized to rate the ‘functional suitability’ of a facility. Both ‘construction quality performance’ and ‘functional suitability’ are suitable for achieving social sustainability, as these KPIs contribute to wellbeing and satisfaction. As such, they can also be assessed on the level of satisfaction (i.e., very satisfied, satisfied, neutral, not satisfied, very dissatisfied) using a Likert scale [86]. ‘Construction quality performance’ leads to economic sustainability, since quality performance prevents cost implications of rework due to failure or nonconformance.
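The percentage-based metrics above could be mapped onto the five expectation grades as sketched below. The rating bands are invented for demonstration only; the paper defines the metric (defect rectification cost over total project cost, times 100) but does not prescribe thresholds.

```python
# Hypothetical rating bands: (upper bound of defect-cost percentage, grade).
# These thresholds are NOT from the paper; they are illustrative only.
RATING_BANDS = [
    (1.0, "excellent performance"),
    (2.5, "very good performance"),
    (5.0, "good performance expectation"),
    (10.0, "average performance expectation"),
    (float("inf"), "poor performance expectation"),
]

def quality_rating(defect_cost, total_cost):
    pct = defect_cost / total_cost * 100.0   # metric expressed percentagewise
    for bound, grade in RATING_BANDS:
        if pct <= bound:
            return pct, grade

# e.g., HK$3.2M of rectification on a HK$200M project -> 1.6% of total cost
pct, grade = quality_rating(defect_cost=3.2e6, total_cost=2.0e8)
```

In practice, the thresholds would need to be calibrated against benchmark projects before such a mapping could support the Likert ratings described above.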
4.3. Project People Performance (KPIG 3)
The KPIG 3 accounted for 14.36% of the total variance explained, with two KPIs having factor loadings of −0.717 and 0.820, respectively. Being the highest ranked, the HPSI for the KPIG 3 was 4.77 and the associated coefficient in the linear model was 0.360. The underlying KPIs entailed ‘stakeholder/end-user satisfaction’ and ‘construction safety performance’. This group of KPIs concerns how the implementation and operation of projects bring best experiences to stakeholders. ‘Stakeholder/end-user satisfaction’ can be measured as the ‘usable floor area of facility spaces categorized as satisfactory regarding amenity and comfort engineering over the total usable floor area of a facility’. This could be expressed percentagewise and be rated using a Likert scale. Regarding ‘construction safety performance’, it can be measured as the number of accidents (injuries or casualties) per assessment period, i.e., man-hours (expressed in percentage). Similarly, a Likert scale could be employed to measure the level of safety (i.e., from low safety to high safety). These KPIs serve as a direct assessment of social sustainability for the sustainable development of HPs. The social sustainability of a hospital is reflected in its capacity to maintain people’s health and offer them a healthy life [82].
5. Integration of the Various Underlying KPIs
Although the three groups are distinct, they do not operate in isolation; rather, the groups influence one another through their underlying KPIs. The interactions among the underlying KPIs were qualitatively depicted based on the consolidation of findings from the literature review (see Figure 3). Within groups, the underlying KPIs of the category ‘project prosecution performance’ influenced one another, in addition to influencing underlying KPIs in the other two categories. For example, delays in ‘construction time performance’ could influence ‘construction cost performance’ through cost overruns. Similarly, ‘flexibility and adaptability of facility’ could influence ‘maintenance effectiveness and efficiency’ of a facility [87]. Furthermore, ‘innovation and improvement’ of a facility will enhance ‘functional capacity and utilization’ and ‘maintenance effectiveness and efficiency’ [87] (see Figure 3). Moreover, ‘maintenance effectiveness and efficiency’ influences ‘functional capacity and utilization’, since keeping a healthcare facility in a functional state and good condition demands effective and efficient maintenance [83].
Concerning the interaction among categories, the underlying KPIs of ‘project prosecution performance’ could influence the underlying KPIs of the category ‘project people performance’. ‘Functional capacity and utilization’ could affect ‘stakeholder/end-user satisfaction’ [87,88]. Likewise, an underlying KPI such as ‘construction safety performance’ could influence KPIs such as ‘construction time performance’ and ‘construction cost performance’. Accidents on construction sites could lead to loss of life or injuries which could also lead to loss of productive labor and job site closure by safety authorities for investigation. Consequently, these safety issues affect ‘construction time performance’ and ‘construction cost performance’. Moreover, ‘project purpose performance’ could influence ‘project prosecution performance’ and ‘project people performance’ via their underlying KPIs. ‘Construction quality performance’, for instance, influences ‘construction cost performance’ and ‘construction time performance’. Poor quality of work could lead to a cost of nonconformance to quality or a cost of poor quality. This is the cost associated with not satisfying the quality expectations of the project. Nonconformance to quality could result in reworks, which could cost the project additional time and money. The results of these could be cost and time overruns on the project. Additionally, ‘construction quality performance’ and ‘functional suitability’ in the category ‘project purpose performance’ could both influence ‘stakeholder/end-user satisfaction’ in the category ‘project people performance’ [89] (see Figure 3).
6. Demonstration of Model Application
A few steps must be followed to compute the composite HPSI for a particular HP using Equation (6). The linear and additive model enables the choice of different units of measurement for the KPIGs in a single assessment process. In this demonstration, the adopted unit of measurement for the KPIGs and KPIs was a seven-point Likert scale defined as 1 = very dissatisfied, 2 = dissatisfied, 3 = slightly dissatisfied, 4 = neutral, 5 = slightly satisfied, 6 = satisfied, and 7 = very satisfied [40]. Although this scale may be sub-optimal due to the inherent subjectivity of the assessment, it is useful for demonstration purposes, pending the establishment of applicable objective scales in future research. Nevertheless, the subjectivity could be reduced by the choice of appropriate respondents, data collection methods (multi-round surveys, e.g., Delphi, focus groups, etc.), and fuzzy analysis methods that account for the limitations of human cognition. It is assumed that a multi-round Delphi survey was conducted among relevant stakeholders about their perceived (dis-)satisfaction levels with the 10 KPIs in two projects, A and B.
The score of each KPIG was derived from the average of the mean ratings of its comprising KPIs. For project A, the scores of KPIG 1, KPIG 2, and KPIG 3 were assumed to be 3, 4, and 5, respectively. For project B, the scores of KPIG 1, KPIG 2, and KPIG 3 were assumed to be 6, 7, and 6, respectively. These sets of scores were then separately substituted into the model to derive the overall HPSI values. For project A, the HPSI value was (0.287 × 3) + (0.353 × 4) + (0.360 × 5) = 4.07. The result means that the surveyed stakeholders are neither dissatisfied nor satisfied with the success of project A. For project B, the HPSI value was computed as (0.287 × 6) + (0.353 × 7) + (0.360 × 6) = 6.35. Thus, the stakeholders are generally satisfied with the success of project B. The computations allow comparison and benchmarking of the success levels of HPs.
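The two demonstrations can be reproduced directly from Equation (6), using the hypothetical KPIG scores assumed in the text:

```python
# Coefficients of Equation (6) for KPIG 1, KPIG 2, and KPIG 3
COEFFS = (0.287, 0.353, 0.360)

def composite_hpsi(kpig_scores):
    """Weighted sum of the three KPIG scores, rounded to two decimals."""
    return round(sum(c * s for c, s in zip(COEFFS, kpig_scores)), 2)

hpsi_a = composite_hpsi((3, 4, 5))  # project A -> 4.07, i.e., about neutral
hpsi_b = composite_hpsi((6, 7, 6))  # project B -> 6.35, i.e., satisfied
```

The resulting indices fall on the same seven-point satisfaction scale as the inputs, which is what permits direct comparison and benchmarking across projects.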
7. Conclusions and Recommendations
In this study, KPIs were identified toward developing a composite HPSI model to evaluate and compare life cycle success levels of HPs. A comprehensive literature review was first conducted to identify KPIs across the life cycle of HPs. Then, a Delphi survey was conducted among experts and a consensus was reached on the ten most essential KPIs for representing the life cycle of HPs. These included: ‘construction safety performance’, ‘stakeholder/end-user satisfaction’, ‘construction quality performance’, ‘functional suitability’, ‘construction cost performance’, ‘flexibility and adaptability of facility’, ‘construction time performance’, ‘maintenance effectiveness and efficiency’, ‘innovation and improvement’, and ‘functional capacity and utilization’. Three categories were developed from the KPIs, viz.: ‘project prosecution performance’, ‘project purpose performance’, and ‘project people performance’. Through a fuzzy synthetic evaluation technique, a normalized composite HPSI was developed. The normalized indices of the three categories of indicators (in brackets) revealed their relative importance as follows: ‘project prosecution performance’ (0.287), ‘project purpose performance’ (0.353), and ‘project people performance’ (0.360). These indices were combined through a linear additive model to develop a composite HPSI evaluation model. The HPSI has both practical and theoretical implications. It can be used to track HP performance regarding the triple bottom line of sustainable development. Moreover, it can be deployed for comparison or benchmarking of HPs for performance improvement, and, therefore, can be used to inform decision-making around HPs by policymakers and practitioners. Furthermore, through the relative importance of the indices, the HPSI seeks to inform policymakers and practitioners working on HPs on the allocation of project resources/efforts among the three categories of KPIs.
For instance, among the three categories of KPIs, more resources (i.e., financial resources, material resources, and human resources) should be allocated to attain ‘project people performance’, followed by ‘project purpose performance’ and ‘project prosecution performance’. This prioritization concerning the allocation of resources among the KPI categories is based on the relative weightings.
Notwithstanding the relevance of the study, there are limitations worth noting. The interactions among the underlying KPIs were depicted based on the synthesis of findings from the literature. Future research could empirically assess the cause–effect relationships among the KPIs. The fuzzy synthetic evaluation technique could also be laborious regarding the computation of indices. Accordingly, future studies should seek to add other variables (i.e., quantitative indicators and ranges of KPIs) to the current findings to develop a computerized success evaluation system for evaluating, monitoring, improving, and benchmarking the life cycle success of HPs in an objective, reliable, and practical manner. As this study demonstrated the model with hypothetical projects, it is recommended that future research incorporate case studies of HPs that have implemented the selected KPIs and achieved sustainability goals. Finally, this study focused on the perspectives of construction experts concerning the KPIs. Future studies could focus on the perspectives of other stakeholders, such as patients, healthcare providers, and community members, on the importance of sustainability indicators or KPIs in HP success.
Conceptualization, A.P.-C.C., M.-W.C. and A.D.; data curation, G.D.O.; formal analysis, G.D.O.; funding acquisition, A.P.-C.C., M.-W.C. and A.D.; investigation, G.D.O.; methodology, A.P.-C.C., M.-W.C. and A.D.; project administration, M.-W.C. and A.D.; supervision, A.P.-C.C.; validation, A.P.-C.C., M.-W.C. and A.D.; visualization, G.D.O. and M.A.A.; writing—original draft, G.D.O. and M.A.A.; writing—review and editing, A.P.-C.C. and A.D. All authors have read and agreed to the published version of the manuscript.
The study’s dataset will be made available to interested persons upon request.
The authors would like to thank the Delphi experts who contributed their HP experiences to the study.
The authors declare no conflicts of interest.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 2. Framework of KPI categories for performance assessment of healthcare projects.
Search of publications in selected academic databases.
Academic Database | Search String/Synonymous Terms | Search Domain | Number of Results |
---|---|---|---|
Scopus | (“healthcare project” OR “healthcare center” OR “health center” OR “healthcare facility” OR “hospital” OR “clinic” OR “infirmary” OR “sanatorium” OR “medical center” OR “medical facility” OR “convalescent home” OR “convalescent facility”) AND (“success” OR “failure” OR “performance” OR “KPI” OR “benchmark” OR “efficiency” OR “effectiveness”) AND (“building project” OR “construction project” OR “infrastructure project” OR “engineering project” OR “construction industry”) | Title, abstract, and keywords | 291 |
Web of Science | (“healthcare project” OR “healthcare center” OR “health center” OR “healthcare facility” OR “hospital” OR “clinic” OR “infirmary” OR “sanatorium” OR “medical center” OR “medical facility” OR “convalescent home” OR “convalescent facility”) AND (“success” OR “failure” OR “performance” OR “KPI” OR “benchmark” OR “efficiency” OR “effectiveness”) AND (“building project” OR “construction project” OR “infrastructure project” OR “engineering project” OR “construction industry”) | All fields | 623 |
Google Scholar | (“healthcare project”, “healthcare center”, “health center”, “healthcare facility”, “hospital”, “clinic”, “infirmary”, “sanatorium”, “medical center”, “medical facility”, “convalescent home”, “convalescent facility”) AND (“success”, “failure”, “performance”, “KPI”, “benchmark”, “efficiency”, “effectiveness”) AND (“building project”, “construction project”, “infrastructure project”, “engineering project”, “construction industry”) | All fields | 419 |
Demographic information of the panel experts.
Demographic Characteristic | No. | % |
---|---|---|
Professional background | | |
Project/Construction Manager | 5 | 26.32% |
Quantity Surveyor | 4 | 21.05% |
Architect | 4 | 21.05% |
Facility/Property Manager | 1 | 5.26% |
Engineer | 3 | 15.79% |
Hospital Administrator | 1 | 5.26% |
Medical Professional | 1 | 5.26% |
Total | 19 | 100% |
Level of experience | | |
1–5 years | 9 | 47.37% |
6–10 years | 5 | 26.32% |
11–15 years | 1 | 5.26% |
Above 15 years | 4 | 21.05% |
Total | 19 | 100% |
Number of healthcare projects | | |
1–2 | 6 | 31.58% |
3–4 | 3 | 15.79% |
5–6 | 3 | 15.79% |
≥6 | 7 | 36.84% |
Total | 19 | 100% |
Sector of client (multiple responses) | | |
Public | 15 | 78.95% |
Private | 5 | 26.32% |
Quasi-public | 3 | 15.79% |
Phase of healthcare project (multiple responses) | | |
Planning phase | 15 | 78.95% |
Construction phase | 17 | 89.47% |
Post-construction phase | 10 | 52.63% |
Results on the selection of the most representative KPIs.
Key Performance Indicators (KPIs) | Round 1 Count | Round 1 % | Round 1 Rank | Round 2 Count | Round 2 % | Round 2 Rank |
---|---|---|---|---|---|---|
Planning and Construction Phases | ||||||
Construction quality performance | 17 | 89.47% | 1 | 16 | 84.21% | 1 |
Construction time performance | 16 | 84.21% | 2 | 16 | 84.21% | 1 |
Construction safety performance | 13 | 68.42% | 4 | 16 | 84.21% | 1 |
Construction cost performance | 15 | 78.95% | 3 | 15 | 78.95% | 4 |
Innovation and improvement | 11 | 57.89% | 5 | 13 | 68.42% | 5 |
Risk management effectiveness | 5 | 26.32% | 6 | 8 | 42.11% | 6 |
Teamwork and collaboration | 5 | 26.32% | 6 | 6 | 31.58% | 7 |
Change occurrence and magnitude | 4 | 21.05% | 8 | 5 | 26.32% | 8 |
Environmental performance | 3 | 15.79% | 10 | 4 | 21.05% | 9 |
Planning effectiveness | 4 | 21.05% | 8 | 4 | 21.05% | 9 |
Building codes adherence | 3 | 15.79% | 10 | 3 | 15.79% | 11 |
Conflict/dispute occurrence and magnitude | 3 | 15.79% | 10 | 2 | 10.53% | 12 |
Participant professionalism and competency | 3 | 15.79% | 10 | 2 | 10.53% | 12 |
Construction productivity | 1 | 5.26% | 18 | 1 | 5.26% | 14 |
Construction resource management | 1 | 5.26% | 18 | 1 | 5.26% | 14 |
Communication effectiveness | 2 | 10.53% | 16 | 1 | 5.26% | 14 |
Client/participant satisfaction | 3 | 15.79% | 10 | 1 | 5.26% | 14 |
* Defects rectifications and improvement extent owing to design afterthought | 1 | 5.26% | 18 | 1 | 5.26% | 14 |
Litigation occurrence and magnitude | 3 | 15.79% | 10 | 0 | 0.00% | 19 |
Claim occurrence and magnitude | 2 | 10.53% | 16 | 0 | 0.00% | 19 |
Scope of rework | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Long-term business relationships | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Harmonious working relationships | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Trust and respect | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Participant profitability | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Professional reputation/image attainment | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Human resource management | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
Learning and development | 0 | 0.00% | 21 | 0 | 0.00% | 19 |
| **Post-Construction Phase** | | | | | | |
Stakeholder/end-user satisfaction | 15 | 78.95% | 1 | 17 | 89.47% | 1 |
Functional suitability | 13 | 68.42% | 2 | 14 | 73.68% | 2 |
Maintenance effectiveness and efficiency | 10 | 52.63% | 3 | 14 | 73.68% | 2 |
Functional capacity and utilization | 10 | 52.63% | 3 | 12 | 63.16% | 4 |
Flexibility and adaptability of facility | 8 | 42.11% | 5 | 10 | 52.63% | 5 |
Service performance | 8 | 42.11% | 5 | 8 | 42.11% | 6 |
Resilience and sustainability of facility | 7 | 36.84% | 7 | 8 | 42.11% | 6 |
Energy utilization | 6 | 31.58% | 8 | 6 | 31.58% | 8 |
Long-term community/societal benefits | 4 | 21.05% | 10 | 4 | 21.05% | 9 |
Service lifespan of facility | 5 | 26.32% | 9 | 3 | 15.79% | 10 |
Maintenance interruptions to operations | 4 | 21.05% | 10 | 3 | 15.79% | 10 |
Operation and maintenance (O&M) expenditure | 3 | 15.79% | 13 | 2 | 10.53% | 12 |
O&M safety performance | 1 | 5.26% | 17 | 2 | 10.53% | 12 |
O&M organization/management effectiveness | 2 | 10.53% | 15 | 2 | 10.53% | 12 |
Healthcare culture/image embeddedness | 4 | 21.05% | 10 | 2 | 10.53% | 12 |
Maintenance time performance | 1 | 5.26% | 17 | 1 | 5.26% | 16 |
Facility condition | 3 | 15.79% | 13 | 1 | 5.26% | 16 |
Site/location optimization | 1 | 5.26% | 17 | 1 | 5.26% | 16 |
Facility integration into locality | 1 | 5.26% | 17 | 1 | 5.26% | 16 |
Commercial profitability/value | 2 | 10.53% | 15 | 0 | 0.00% | 20 |
O&M statutory compliance | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
Spare parts management | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
O&M policy/guideline deployment | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
O&M information management/sharing | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
Water and waste management | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
Current replacement value of facility | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
Visual appearance and appeal | 0 | 0.00% | 21 | 0 | 0.00% | 20 |
* Newly suggested KPI.
Results on the rating of the shortlisted KPIs.
| S/N | Shortlisted Key Performance Indicators (KPIs) | Mean (Round 3) | Rank (Round 3) | Mean (Round 4) | Rank (Round 4) |
|---|---|---|---|---|---|
KPI1 | Construction quality performance | 4.67 | 1 | 4.87 | 1 |
KPI2 | Construction safety performance | 4.60 | 3 | 4.80 | 2 |
KPI3 | Stakeholder/end-user satisfaction | 4.67 | 1 | 4.73 | 3 |
KPI4 | Functional suitability | 4.20 | 5 | 4.47 | 4 |
KPI5 | Construction time performance | 4.40 | 4 | 4.33 | 5 |
KPI6 | Construction cost performance | 4.00 | 7 | 4.13 | 6 |
KPI7 | Maintenance effectiveness and efficiency | 4.13 | 6 | 4.00 | 7 |
KPI8 | Functional capacity and utilization | 3.67 | 8 | 3.73 | 8 |
KPI9 | Innovation and improvement | 3.33 | 10 | 3.13 | 9 |
KPI10 | Flexibility and adaptability of facility | 3.53 | 9 | 3.13 | 9 |
| | N = 15 | | | | |
| | Agreement level (Kendall’s W) | 0.395 | | 0.594 | |
| | Asymp. sig. | 0.000 | | 0.000 | |
| | Improvement in agreement level | | | 50.38% | |
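Kendall’s coefficient of concordance (W) reported above rose from 0.395 in Round 3 to 0.594 in Round 4, indicating improved agreement among the 15 experts. As a minimal sketch, assuming an m-raters-by-n-KPIs matrix of raw importance scores and ignoring the correction for tied ranks, W can be computed as follows (function name and structure are illustrative, not the authors’ code):

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ratings: np.ndarray) -> float:
    """Kendall's coefficient of concordance for an (m raters x n items) matrix.

    Each rater's scores are converted to ranks (ties get average ranks).
    W ranges from 0 (no agreement) to 1 (perfect agreement).
    Note: this simplified version omits the tie-correction term.
    """
    m, n = ratings.shape
    # Rank each rater's scores across the n items.
    ranks = np.apply_along_axis(rankdata, 1, ratings)
    rank_sums = ranks.sum(axis=0)
    # Sum of squared deviations of the rank sums from their mean.
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    # W = 12S / (m^2 (n^3 - n)).
    return 12.0 * s / (m ** 2 * (n ** 3 - n))
```

With 15 identical rankings of 10 KPIs the function returns 1.0; with raters in direct opposition the rank sums equalize and W approaches 0, which is why rising W across Delphi rounds signals converging consensus.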
Summary of fuzzy analysis results.
| S/N | Loading | Eigenvalue | % of Var. Expl. | Cum. % of Var. Expl. | Mean Score (M) | Weighting (W) | MF₁ | MF₂ | MF₃ | MF₄ | MF₅ | Index Value | Normalized | Rank (Grade) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| KPIG1 | — | 4.39 | 43.91 | 43.91 | 22.467 | 0.544 | 0.05 | 0.05 | 0.18 | 0.50 | 0.22 | 3.803 | 0.287 | 3rd |
| KPI9 | 0.895 | — | — | — | 3.133 | 0.139 | 0.13 | 0.00 | 0.47 | 0.40 | 0.00 | — | — | — |
| KPI10 | 0.868 | — | — | — | 3.133 | 0.139 | 0.20 | 0.00 | 0.27 | 0.53 | 0.00 | — | — | — |
| KPI6 | 0.848 | — | — | — | 4.133 | 0.184 | 0.00 | 0.07 | 0.07 | 0.53 | 0.33 | — | — | — |
| KPI5 | 0.822 | — | — | — | 4.333 | 0.193 | 0.00 | 0.07 | 0.07 | 0.33 | 0.53 | — | — | — |
| KPI7 | 0.817 | — | — | — | 4.000 | 0.178 | 0.00 | 0.07 | 0.13 | 0.53 | 0.27 | — | — | — |
| KPI8 | 0.724 | — | — | — | 3.733 | 0.166 | 0.00 | 0.07 | 0.20 | 0.67 | 0.07 | — | — | — |
| KPIG2 | — | 1.80 | 17.99 | 61.89 | 9.333 | 0.226 | 0.00 | 0.00 | 0.00 | 0.32 | 0.68 | 4.675 | 0.353 | 2nd |
| KPI1 | 0.856 | — | — | — | 4.867 | 0.521 | 0.00 | 0.00 | 0.00 | 0.13 | 0.87 | — | — | — |
| KPI4 | −0.802 | — | — | — | 4.467 | 0.479 | 0.00 | 0.00 | 0.00 | 0.53 | 0.47 | — | — | — |
| KPIG3 | — | 1.44 | 14.36 | 76.25 | 9.533 | 0.231 | 0.00 | 0.00 | 0.03 | 0.17 | 0.80 | 4.767 | 0.360 | 1st |
| KPI3 | 0.820 | — | — | — | 4.733 | 0.497 | 0.00 | 0.00 | 0.07 | 0.13 | 0.80 | — | — | — |
| KPI2 | −0.717 | — | — | — | 4.800 | 0.503 | 0.00 | 0.00 | 0.00 | 0.20 | 0.80 | — | — | — |
| Total mean for KPIGs | | | | | 41.333 | | | | | | | | | |
Note: MF₁–MF₅ on KPI rows are the estimated membership functions (MFs) at Level 2 (KPIs); on KPIG rows, the MFs at Level 1 (KPIGs); extraction method: principal component analysis; rotation method: Varimax with Kaiser normalization; rotation converged in five iterations; cum. = cumulative; var. expl. = variance explained; v. imp. = very important; and m. imp. = most important.
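The fuzzy synthetic evaluation step behind the table can be illustrated with the KPIG3 (‘project people performance’) figures: the Level-1 MFs are the weight-averaged Level-2 MFs, the index value is the MF vector defuzzified against the 1–5 rating scale, and normalizing the three group indices yields the KPIG weightings. A minimal sketch (variable names are illustrative, not the authors’ code):

```python
import numpy as np

# Level-2 membership functions (fraction of experts rating 1..5) and KPI
# weights for KPIG3, taken from the table above.
kpi_mfs = np.array([
    [0.00, 0.00, 0.07, 0.13, 0.80],  # KPI3: stakeholder/end-user satisfaction (w = 0.497)
    [0.00, 0.00, 0.00, 0.20, 0.80],  # KPI2: construction safety performance  (w = 0.503)
])
weights = np.array([0.497, 0.503])

# Level-1 MF of the group: weighted average of the Level-2 MFs.
group_mf = weights @ kpi_mfs          # approx [0.00, 0.00, 0.03, 0.17, 0.80]

# Index value: defuzzify against the 1-5 rating scale.
index = group_mf @ np.arange(1, 6)    # approx 4.77, matching the table within rounding

# Normalizing each group index by the sum of all three group indices gives
# the KPIG weightings in the HPSI (0.287, 0.353, 0.360).
indices = np.array([3.803, 4.675, index])
normalized = indices / indices.sum()
```

The same two-step aggregation, applied with the Level-2 weights within each group and the normalized weights across groups, reproduces the HPSI structure reported in the abstract.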
Supplementary Materials
The following supporting information can be downloaded at:
References
1. United Nations. World Population Prospects 2019: Highlights (ST/ESA/SER.A/423); United Nations: New York, NY, USA, 2019.
2. World Health Organization. World Health Statistics 2019: Monitoring Health for the SDGs, Sustainable Development Goals; World Health Organization: Geneva, Switzerland, 2019.
3. Organisation for Economic Co-Operation and Development (OECD). Investing in Health Systems to Protect Society and Boost the Economy: Priority Investments and Order-of-Magnitude Cost Estimates; Organisation for Economic Co-Operation and Development (OECD): Paris, France, 2022.
4. World Cities Culture Forum. The Creative Economy: A Cornerstone of Hong Kong’s Future; World Cities Culture Forum: London, UK, 2024.
5. Hospital Authority. Introduction; Hospital Authority: Hong Kong, 2024.
6. International Trade Administration. Hong Kong: Healthcare; International Trade Administration: Washington, DC, USA, 2024.
7. Legislative Council Panel on Health Services (LCPHS). Second Ten-Year Hospital Development Plan; LC Paper No. CB (2)1167/18-19(07); Hong Kong Government: Hong Kong, 2019.
8. Legislative Council Panel on Health Services (LCPHS). The First and Second 10-Year Hospital Development Plan; LC Paper No. CB (4)600/20-21(08); Hong Kong Government: Hong Kong, 2021.
9. World Health Organization. Hospitals; World Health Organization: Geneva, Switzerland, 2021.
10. GlobalData. Project Insight: Global Healthcare Construction Projects (Q2 2024); GlobalData: London, UK, 2024.
11. Sharma, V.; Caldas, C.H.; Mulva, S.P. Development of metrics and an external benchmarking program for healthcare facilities. Int. J. Constr. Manag.; 2021; 21, pp. 615-630. [DOI: https://dx.doi.org/10.1080/15623599.2019.1573490]
12. Soliman-Junior, J.; Tzortzopoulos, P.; Baldauf, J.P.; Pedo, B.; Kagioglou, M.; Formoso, C.T.; Humphreys, J. Automated compliance checking in healthcare building design. Autom. Constr.; 2021; 129, 103822. [DOI: https://dx.doi.org/10.1016/j.autcon.2021.103822]
13. Amos, D.; Au-Yong, C.P.; Musa, Z.N. The mediating effects of finance on the performance of hospital facilities management services. J. Build. Eng.; 2021; 34, 101899. [DOI: https://dx.doi.org/10.1016/j.jobe.2020.101899]
14. Lavy, S.; Garcia, J.A.; Dixit, M.K. Establishment of KPIs for facility performance measurement: Review of literature. Facilities; 2010; 28, pp. 440-464. [DOI: https://dx.doi.org/10.1108/02632771011057189]
15. Shohet, I.M. Key performance indicators for maintenance of hospital buildings. Proceedings of the CIB W070 2002 Global Symposium; Glasgow, Scotland, 18–20 September 2002; Volume 70, pp. 79-90.
16. Steinke, C.; Webster, L.; Fontaine, M. Evaluating building performance in healthcare facilities: An organizational perspective. HERD Health Environ. Res. Des. J.; 2010; 3, pp. 63-83. [DOI: https://dx.doi.org/10.1177/193758671000300207] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/21165871]
17. Rosacker, K.M.; Zuckweiler, K.M.; Buelow, J.R. An Empirical Evaluation of Hospital Project Implementation Success. Acad. Health Care Manag. J.; 2010; 6, pp. 37-53.
18. Ling, F.Y.Y.; Li, Q. Managing the development and construction of public hospital projects. IOP Conf. Ser. Mater. Sci. Eng.; 2019; 471, 022001. [DOI: https://dx.doi.org/10.1088/1757-899X/471/2/022001]
19. Adamy, A.; Abu Bakar, A.H. Developing a building-performance evaluation framework for post-disaster reconstruction: The case of hospital buildings in Aceh, Indonesia. Int. J. Constr. Manag.; 2021; 21, pp. 56-77. [DOI: https://dx.doi.org/10.1080/15623599.2018.1506903]
20. Chan, A.P.L.; Chan, A.P.C.; Chan, D.W.M. An empirical survey of the success criteria for running healthcare projects. Archit. Sci. Rev.; 2005; 48, pp. 61-68. [DOI: https://dx.doi.org/10.3763/asre.2005.4809]
21. Adamy, A. Disaster-Resilient Building: Lesson Learned from a Building Performance Evaluation of Meuraxa Hospital in Aceh, Indonesia. Resilient and Responsible Smart Cities; Springer: Berlin/Heidelberg, Germany, 2021; pp. 179-193. [DOI: https://dx.doi.org/10.1007/978-3-030-63567-1_16]
22. Gimbert, X.; Bisbe, J.; Mendoza, X. The role of performance measurement systems in strategy formulation processes. Long Range Plan.; 2010; 43, pp. 477-497. [DOI: https://dx.doi.org/10.1016/j.lrp.2010.01.001]
23. Amos, D.; Musa, Z.N.; Au-Yong, C.P. Performance measurement of facilities management services in Ghana’s public hospitals. Build. Res. Inf.; 2020; 48, pp. 218-238. [DOI: https://dx.doi.org/10.1080/09613218.2019.1660607]
24. Adamy, A.; Abu Bakar, A.H. Key criteria for post-reconstruction hospital building performance. IOP Conf. Ser. Mater. Sci. Eng.; 2019; 469, 012072. [DOI: https://dx.doi.org/10.1088/1757-899X/469/1/012072]
25. Lavy, S.; Garcia, J.A.; Dixit, M.K. KPIs for facility’s performance assessment, Part I: Identification and categorization of core indicators. Facilities; 2014; 32, pp. 256-274. [DOI: https://dx.doi.org/10.1108/F-09-2012-0066]
26. Amos, D.; Au-Yong, C.P.; Musa, Z.N. Developing key performance indicators for hospital facilities management services: A developing country perspective. Eng. Constr. Archit. Manag.; 2020; 27, pp. 2715-2735. [DOI: https://dx.doi.org/10.1108/ECAM-11-2019-0642]
27. Gómez-Chaparro, M.; García-Sanz-Calcedo, J.; Aunión-Villa, J. Maintenance in hospitals with less than 200 beds: Efficiency indicators. Build. Res. Inf.; 2020; 48, pp. 526-537. [DOI: https://dx.doi.org/10.1080/09613218.2019.1678007]
28. Lai, J.H.; Hou, H.C.; Edwards, D.J.; Yuen, P.L. An analytic network process model for hospital facilities management performance evaluation. Facilities; 2021; 40, pp. 333-352. [DOI: https://dx.doi.org/10.1108/F-09-2021-0082]
29. Wai, S.H.; Aminah, M.Y.; Syuhaida, I. Social infrastructure project success criteria: An exploratory study. Int. J. Constr. Manag.; 2013; 13, pp. 95-104. [DOI: https://dx.doi.org/10.1080/15623599.2013.10773218]
30. Iskandar, K.A.; Hanna, A.S.; Lotfallah, W. Modeling the performance of healthcare construction projects. Eng. Constr. Archit. Manag.; 2019; 26, pp. 2023-2039. [DOI: https://dx.doi.org/10.1108/ECAM-08-2018-0323]
31. Adabre, M.A.; Chan, A.P.; Darko, A.; Hosseini, M.R. Facilitating a transition to a circular economy in construction projects: Intermediate theoretical models based on the theory of planned behaviour. Build. Res. Inf.; 2023; 51, pp. 85-104. [DOI: https://dx.doi.org/10.1080/09613218.2022.2067111]
32. Adabre, M.A.; Chan, A.P.; Darko, A.; Edwards, D.J.; Yang, Y.; Issahaque, S. No Stakeholder Is an Island in the Drive to This Transition: Circular Economy in the Built Environment. Sustainability; 2024; 16, 6422. [DOI: https://dx.doi.org/10.3390/su16156422]
33. Chan, A.P.C.; Chan, E.H.; Chan, A.P.L. Managing Health Care Projects in Hong Kong: A Case Study of the North District Hospital. Int. J. Constr. Manag.; 2003; 3, pp. 1-13. [DOI: https://dx.doi.org/10.1080/15623599.2003.10773039]
34. Chan, A.P.L.; Chan, A.P.C.; Chan, D.W.M. A study of managing healthcare projects in Hong Kong. Proceedings of the 19th Annual ARCOM Conference; Brighton, UK, 3–5 September 2003; Greenwood, D.J. Association of Researchers in Construction Management: Brighton, UK, 2003; Volume 2, pp. 513-522.
35. Zuo, J.; Zillante, G.; Zhao, Z.Y.; Xia, B. Does project culture matter? A comparative study of two major hospital projects. Facilities; 2014; 32, pp. 801-824. [DOI: https://dx.doi.org/10.1108/F-02-2013-0014]
36. Choi, J.; Leite, F.; de Oliveira, D.P. BIM-based benchmarking for healthcare construction projects. Autom. Constr.; 2020; 119, 103347. [DOI: https://dx.doi.org/10.1016/j.autcon.2020.103347]
37. Buelow, J.R.; Zuckweiler, K.M.; Rosacker, K.M. Evaluation methods for hospital projects. Hosp. Top.; 2010; 88, pp. 10-17. [DOI: https://dx.doi.org/10.1080/00185860903534182] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/20194106]
38. Liu, W.; Chan, A.P.; Chan, M.W.; Darko, A.; Oppong, G.D. Key performance indicators for hospital planning and construction: A systematic review and meta-analysis. Eng. Constr. Archit. Manag.; 2024; ahead-of-print [DOI: https://dx.doi.org/10.1108/ECAM-10-2023-1060]
39. Do, D.; Ballard, G.; Tillmann, P. Part 1 of 5: The Application of Target Value Design in the Design and Construction of the UHS Temecula Valley Hospital; Project Production Systems Laboratory, University of California: Berkeley, CA, USA, 2015.
40. Chan, A.P.C.; Chan, A.P.L. Key performance indicators for measuring construction success. Benchmarking Int. J.; 2004; 11, pp. 203-221. [DOI: https://dx.doi.org/10.1108/14635770410532624]
41. Choi, J.; Leite, F.; de Oliveira, D.P. BIM-based benchmarking system for healthcare projects: Feasibility study and functional requirements. Autom. Constr.; 2018; 96, pp. 262-279. [DOI: https://dx.doi.org/10.1016/j.autcon.2018.09.015]
42. Ahmad, H.; Abdul Aziz, A.R.; Jaafar, M. Success criteria for design-and-build public hospital construction project in Malaysia: An empirical study. Appl. Mech. Mater.; 2015; 749, pp. 410-414.
43. Chan, A.P.L. Critical Success Factors for Delivering Healthcare Projects in Hong Kong. Ph.D. Thesis; Department of Building and Real Estate: Hong Kong Polytechnic University, Hong Kong, China, 2004.
44. Omar, M.F.; Ibrahim, F.A.; Omar, W.M.S.W. Key performance indicators for maintenance management effectiveness of public hospital building. MATEC Web Conf.; 2017; 97, 01056. [DOI: https://dx.doi.org/10.1051/matecconf/20179701056]
45. Talib, Y.; Yang, R.J.; Rajagopalan, P. Evaluation of building performance for strategic facilities management in healthcare: A case study of a public hospital in Australia. Facilities; 2013; 31, pp. 681-701. [DOI: https://dx.doi.org/10.1108/f-06-2012-0042]
46. Pei, Y. A framework of output specifications and evaluation method for hospital PPP projects. Open J. Bus. Manag.; 2019; 7, pp. 167-179. [DOI: https://dx.doi.org/10.4236/ojbm.2019.71012]
47. Gokhale, S.; Gormley, T.C. Construction Management of Healthcare Projects; McGraw-Hill Education: New York, NY, USA, 2014.
48. Edum-Fotwe, F.T.; Egbu, C.; Gibb, A.G.F. Designing facilities management needs into infrastructure projects: Case from a major hospital. J. Perform. Constr. Facil.; 2003; 17, pp. 43-50. [DOI: https://dx.doi.org/10.1061/(ASCE)0887-3828(2003)17:1(43)]
49. Lai, J.; Yuen, P.L. Performance evaluation for hospital facility management: Literature review and a research methodology. J. Facil. Manag. Educ. Res.; 2019; 3, pp. 38-43. [DOI: https://dx.doi.org/10.22361/jfmer/96267] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39817120]
50. Li, Y.; Cao, L.; Han, Y.; Wei, J. Development of a conceptual benchmarking framework for healthcare facilities management: Case study of shanghai municipal hospitals. J. Constr. Eng. Manag.; 2020; 146, 05019016. [DOI: https://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0001731]
51. Omar, M.F.; Ibrahim, F.A.; Omar, W.M.S.W. An assessment of the maintenance management effectiveness of public hospital building through key performance indicators. Sains Hum.; 2016; 1, pp. 51-56. [DOI: https://dx.doi.org/10.11113/sh.v8n4-2.1059]
52. Marzouk, M.; Hanafy, M. Modelling maintainability of healthcare facilities services systems using BIM and business intelligence. J. Build. Eng.; 2022; 46, 103820. [DOI: https://dx.doi.org/10.1016/j.jobe.2021.103820]
53. Shohet, I.M. Key performance indicators for strategic healthcare facilities maintenance. J. Constr. Eng. Manag.; 2006; 132, pp. 345-352. [DOI: https://dx.doi.org/10.1061/(ASCE)0733-9364(2006)132:4(345)]
54. Lai, J.; Yuen, P.L. Identification, classification and shortlisting of performance indicators for hospital facilities management. Facilities; 2021; 39, pp. 4-18. [DOI: https://dx.doi.org/10.1108/F-08-2019-0092]
55. Lai, J.H.; Hou, H.C.; Chiu, B.W.; Edwards, D.; Yuen, P.L.; Sing, M.; Wong, P. Importance of hospital facilities management performance indicators: Building practitioners’ perspectives. J. Build. Eng.; 2022; 45, 103428. [DOI: https://dx.doi.org/10.1016/j.jobe.2021.103428]
56. Hallowell, M.R.; Gambatese, J.A. Qualitative Research: Application of the Delphi Method to CEM Research. J. Constr. Eng. Manag.; 2010; 136, pp. 99-107. [DOI: https://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0000137]
57. Ameyaw, E.E. Risk Allocation Model for Public-Private Partnership Water Supply Projects in Ghana. Ph.D. Thesis; The Hong Kong Polytechnic University: Hong Kong, China, 2015.
58. Sourani, A.; Sohail, M. The Delphi method: Review and use in construction management research. Int. J. Constr. Educ. Res.; 2015; 11, pp. 54-76. [DOI: https://dx.doi.org/10.1080/15578771.2014.917132]
59. Manoliadis, O.; Tsolas, I.; Nakou, A. Sustainable construction and drivers of change in Greece: A Delphi study. Constr. Manag. Econ.; 2006; 24, pp. 113-120. [DOI: https://dx.doi.org/10.1080/01446190500204804]
60. Ibrahim, C.K.I.C.; Costello, S.B.; Wilkinson, S. Development of a conceptual team integration performance index for alliance projects. Constr. Manag. Econ.; 2013; 31, pp. 1128-1143. [DOI: https://dx.doi.org/10.1080/01446193.2013.854399]
61. Chan, A.P.C.; Yung, E.H.; Lam, P.T.; Tam, C.M.; Cheung, S.O. Application of Delphi method in selection of procurement systems for construction projects. Constr. Manag. Econ.; 2001; 19, pp. 699-718. [DOI: https://dx.doi.org/10.1080/01446190110066128]
62. Yeung, J.F.; Chan, A.P.C.; Chan, D.W.; Li, L.K. Development of a partnering performance index (PPI) for construction projects in Hong Kong: A Delphi study. Constr. Manag. Econ.; 2007; 25, pp. 1219-1237. [DOI: https://dx.doi.org/10.1080/01446190701598673]
63. Yeung, J.F.; Chan, A.P.C.; Chan, D.W. Developing a performance index for relationship-based construction projects in Australia: Delphi study. J. Manag. Eng.; 2009; 25, pp. 59-68. [DOI: https://dx.doi.org/10.1061/(ASCE)0742-597X(2009)25:2(59)]
64. Hsu, C.; Sandford, B.A. The Delphi technique: Making sense of consensus. Pract. Assess. Res. Eval.; 2007; 12, pp. 1-8.
65. Hasson, F.; Keeney, S.; McKenna, H. Research guidelines for the Delphi survey technique. J. Adv. Nurs.; 2000; 32, pp. 1008-1015. [DOI: https://dx.doi.org/10.1046/j.1365-2648.2000.t01-1-01567.x]
66. Ameyaw, E.E.; Hu, Y.; Shan, M.; Chan, A.P.C.; Le, Y. Application of Delphi method in construction engineering and management research: A quantitative perspective. J. Civ. Eng. Manag.; 2016; 22, pp. 991-1000. [DOI: https://dx.doi.org/10.3846/13923730.2014.945953]
67. Osei-Kyei, R.; Chan, A.P.; Javed, A.A.; Ameyaw, E.E. Critical success criteria for public-private partnership projects: International experts’ opinion. Int. J. Strateg. Prop. Manag.; 2017; 21, pp. 87-100. [DOI: https://dx.doi.org/10.3846/1648715X.2016.1246388]
68. Oppong, G.D.; Chan, A.P.; Ameyaw, E.E.; Frimpong, S.; Dansoh, A. Fuzzy evaluation of the factors contributing to the success of external stakeholder management in construction. J. Constr. Eng. Manag.; 2021; 147, 04021142. [DOI: https://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0002155]
69. Hair, J.F.; Anderson, R.E.; Tatham, R.L.; Black, W.C. Multivariate Data Analysis; 5th ed. Prentice Hall: Upper Saddle River, NJ, USA, 1998.
70. Zadeh, L.A. Fuzzy sets. Inf. Control; 1965; 8, pp. 338-353. [DOI: https://dx.doi.org/10.1016/S0019-9958(65)90241-X]
71. Tah, J.H.M.; Carr, V. A proposal for construction project risk assessment using fuzzy logic. Constr. Manag. Econ.; 2000; 18, pp. 491-500. [DOI: https://dx.doi.org/10.1080/01446190050024905]
72. Hu, Y.; Chan, A.P.C.; Le, Y.; Xu, Y.; Shan, M. Developing a program organization performance index for delivering construction megaprojects in China: Fuzzy synthetic evaluation analysis. J. Manag. Eng.; 2016; 32, 05016007. [DOI: https://dx.doi.org/10.1061/(ASCE)ME.1943-5479.0000432]
73. Singh, D.; Tiong, R.L.K. A fuzzy decision framework for contractor selection. J. Constr. Eng. Manag.; 2005; 131, pp. 62-70. [DOI: https://dx.doi.org/10.1061/(ASCE)0733-9364(2005)131:1(62)]
74. Boussabaine, A. Risk Pricing Strategies for Public-Private Partnership Projects; 1st ed. John Wiley and Sons: New York, NY, USA, 2014.
75. Lo, S.M. A fire safety assessment system for existing buildings. Fire Technol.; 1999; 35, pp. 131-152. [DOI: https://dx.doi.org/10.1023/A:1015463821818]
76. Gebremeskel, M.N.; Kim, S.Y.; Thuc, L.D.; Nguyen, M.V. Forming a driving index for implementing public-private partnership projects in emerging economy: Ethiopian perception. Eng. Constr. Archit. Manag.; 2021; 28, pp. 2925-2947. [DOI: https://dx.doi.org/10.1108/ECAM-06-2020-0459]
77. Ameyaw, E.E.; Chan, A.P.C. Critical success factors for public-private partnership in water supply projects. Facilities; 2016; 34, pp. 124-160. [DOI: https://dx.doi.org/10.1108/F-04-2014-0034]
78. Liu, J.; Li, Q.; Wang, Y. Risk analysis in ultra deep scientific drilling project—A fuzzy synthetic evaluation approach. Int. J. Proj. Manag.; 2013; 31, pp. 449-458. [DOI: https://dx.doi.org/10.1016/j.ijproman.2012.09.015]
79. Onkal-Engin, G.; Demir, I.; Hiz, H. Assessment of urban air quality in Istanbul using fuzzy synthetic evaluation. Atmos. Environ.; 2004; 38, pp. 3809-3815. [DOI: https://dx.doi.org/10.1016/j.atmosenv.2004.03.058]
80. Nunnally, J.C. Psychometric Theory; 2nd ed. McGraw-Hill: New York, NY, USA, 1978.
81. Phillips, J. CAPM/PMP Project Management All-in-One Exam Guide; McGraw-Hill, Inc.: New York, NY, USA, 2007.
82. Tushar, S.R.; Moktadir, M.A.; Kusi-Sarpong, S.; Ren, J. Driving sustainable healthcare service management in the hospital sector. J. Clean. Prod.; 2023; 420, 138310. [DOI: https://dx.doi.org/10.1016/j.jclepro.2023.138310]
83. Ebekozien, A. Maintenance practices in Nigeria’s public health-care buildings: A systematic review of issues and feasible solutions. J. Facil. Manag.; 2021; 19, pp. 32-52. [DOI: https://dx.doi.org/10.1108/JFM-08-2020-0052]
84. Fotovatfard, A.; Heravi, G. Identifying key performance indicators for healthcare facilities maintenance. J. Build. Eng.; 2021; 42, 102838. [DOI: https://dx.doi.org/10.1016/j.jobe.2021.102838]
85. Han, S.; Jeong, Y.; Lee, K.; In, J. Environmental sustainability in health care: An empirical investigation of US hospitals. Bus. Strategy Environ.; 2024; 33, pp. 6045-6065. [DOI: https://dx.doi.org/10.1002/bse.3790]
86. Adabre, M.A.; Chan, A.P.; Wuni, I.Y. Modeling Sustainable Housing for Sustainable Development in Cities and Communities: The Perspective of a Developing Economy. Circular Economy for Buildings and Infrastructure: Principles, Practices and Future Directions; Springer International Publishing: Cham, Switzerland, 2024; pp. 97-115.
87. Chan, A.P.; Adabre, M.A. Bridging the gap between sustainable housing and affordable housing: The required critical success criteria (CSC). Build. Environ.; 2019; 151, pp. 112-125. [DOI: https://dx.doi.org/10.1016/j.buildenv.2019.01.029]
88. Adabre, M.A.; Chan, A.P. Towards a sustainability assessment model for affordable housing projects: The perspective of professionals in Ghana. Eng. Constr. Archit. Manag.; 2020; 27, pp. 2523-2551. [DOI: https://dx.doi.org/10.1108/ECAM-08-2019-0432]
89. Adabre, M.A.; Chan, A.P. Critical success factors (CSFs) for sustainable affordable housing. Build. Environ.; 2019; 156, pp. 203-214. [DOI: https://dx.doi.org/10.1016/j.buildenv.2019.04.030]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Hospital projects or healthcare projects (HPs) are major contributors to greenhouse gas emissions, high energy consumption, and environmental pollution. These problems serve as a clarion call for the development of a standardized list of metrics that define the triple bottom line of sustainability performance, track sustainability progress, and allow for essential comparisons or benchmarking of HPs. Through a comprehensive literature review, a Delphi survey with experts, and a fuzzy synthetic evaluation, the ten most suitable key performance indicators (KPIs) were identified, categorized, and modeled into a normalized HP success index (HPSI). The HPSI comprises relatively weighted (in parentheses) KPI categories, namely, ‘project prosecution performance’ (0.287), ‘project purpose performance’ (0.353), and ‘project people performance’ (0.360), for evaluating and comparing success levels of HPs. The HPSI provides insight into the relative contributions of the standardized KPIs to achieving predictable life cycle success levels of HPs. Ultimately, it can be used by policymakers and practitioners to inform life cycle decision-making (e.g., resource/effort allocation toward important contributors to success) in HPs. Future studies should seek to develop a computerized HPSI system, by adding quantitative indicators and ranges of KPIs to the current findings, to objectively and practically assess, monitor, benchmark, and improve HP success across the life cycle.
Details
1 Department of Building and Real Estate, Hong Kong Polytechnic University, 11 Yuk Choi Road, Hung Hom, Kowloon, Hong Kong;
2 Department of Construction Management, University of Washington, Seattle, WA 98195, USA;