Evidence-based design (EBD) refers to the process of making design decisions for the healthcare built environment based on what research has shown to be effective, with the goal of achieving the best possible outcomes. And while tools are available to help measure the effectiveness of EBD principles, they haven't been universally adopted by organizations.

To help address that gap, a research team implemented a post-occupancy evaluation (POE) tool on a major academic medical center’s new bed tower. The results are shared in the case study “Utilization of a Standardized Post-occupancy Evaluation to Assess the Guiding Principles of a Major Academic Medical Center” by Zack Altizer, William J. Canar, Dave Redemske, Francis Fullam, and Mike Lamont, published in the latest issue of Health Environments Research & Design Journal (HERD).

In the report, the team notes that several challenges have hindered universal adoption of POEs. For one, architecture firms frequently have internal models that aren’t shared with the broader market, nor are they used on every project. Another hurdle is replication: most POEs are customized for a specific project, so there’s an inherent lack of standardization that makes it difficult to generalize findings or reproduce approaches across organizations.

However, that gap narrowed in 2015, when The Center for Health Design introduced its Patient Room Design Checklist and Evaluation Tool, which provides a standardized POE survey to assess facilities and their implemented design features. The tool’s reliability and validity had not yet been assessed, though, so this case study served as a pilot effort.

Data collected in the POE consisted of 23 reviews conducted by eight auditors across three rooms. The rooms shared identical layouts, except for one that served as a negative-pressure room. The internal benchmark for the study was an overall score greater than 75 percent. Results showed the bed tower achieved a passing score on only 16 of the 23 possible EBD goals, for an overall score of 69.9 percent, falling short of the predetermined threshold.

But the main intent of the study was to pilot the use of the checklist itself, to see whether the research team could begin to develop an internal, reproducible POE process based on The Center’s tool. Some shortcomings of the process were noted, though, the most significant being low inter-rater reliability, which highlighted a lack of training for participants as well as the subjective nature of the questionnaire itself.

“There were several limitations to our effort worth highlighting, with the most prominent being the following: reliability and validity of the CHD questionnaire, the need for more clinical personnel in the audit, and the development of a more robust training program to help standardize auditing practices,” the study reads.

Looking forward, the team identified opportunities as well, speculating that results might improve if auditors were given not just more training but also feedback on their assessments until their results were consistent with a model assessment. The case study also suggests future efforts might investigate how auditors would use the survey in unfamiliar surroundings (the auditors here were already familiar with the patient rooms they assessed).

The full case study is available exclusively to Healthcare Design readers on the HERD website until Jan. 5, 2019. To read it, visit https://bit.ly/2zbyYxy.