A recent comparative study by Cai et al., involving 111 staff workers, found that an office computer workstation self-assessment checklist offers good reliability and validity. When answering the same checklist question on two separate occasions, subjects provided the same answer 78 percent of the time (reliability test).
Agreement between a volunteer and an ergonomic expert (validity test) occurred nearly 80 percent of the time when evaluating workstation features (e.g., seat height is adjustable). However, when assessing working postures (e.g., trunk is perpendicular to the floor, supported by the back of the chair), the agreement rate fell to 64 percent.
The authors created a 30-item online questionnaire generated from the content of two Federal OSHA computer workstation documents (http://www.osha.gov/SLTC/etools/computerworkstations/pdffiles/checklist1.pdf and http://www.osha.gov/SLTC/etools/computerworkstations/positions.html). Twenty-one questions addressed workstation features, while nine were oriented to working postures.
Of the 508 World Bank staff members randomly selected to participate in the study, 111 completed all requested activities.
The subjects were asked to complete the 30-item online computer workstation checklist twice within 5 days. An ergonomist then performed an office ergonomics assessment on each of the 111 subjects using the same checklist. All activity was accomplished over a three-month period.
Checklist Reliability Analysis
Reliability was determined for each checklist item as an agreement percentage: number of subjects who gave a consistent answer to a question divided by the total number of subjects that responded to the question.
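The per-item reliability calculation described above can be sketched as follows. This is a minimal illustration, not the authors' code; the subject answers are invented for demonstration.

```python
def reliability_agreement(first_pass, second_pass):
    """Percent of responding subjects who gave the same answer to an
    item on both passes (the agreement percentage described above).

    first_pass / second_pass: dicts mapping subject id -> answer;
    a subject counts as responding only if present in both passes.
    """
    responders = [s for s in first_pass if s in second_pass]
    if not responders:
        return 0.0
    consistent = sum(1 for s in responders if first_pass[s] == second_pass[s])
    return 100.0 * consistent / len(responders)

# Hypothetical example: four subjects answer a yes/no item twice;
# three of the four give the same answer both times.
t1 = {"s1": "yes", "s2": "no", "s3": "yes", "s4": "yes"}
t2 = {"s1": "yes", "s2": "no", "s3": "no",  "s4": "yes"}
print(reliability_agreement(t1, t2))  # 75.0
```

In the study this calculation would be repeated for each of the 30 checklist items across the 111 subjects.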
Checklist Validity Analysis
The ergonomist’s assessment of a checklist statement was accepted as accurate. Validity was established for each checklist item as an agreement percentage: number of subject-ergonomist consistent answers to a question divided by the total number of subject-ergonomist question responses.
Because each subject completed the checklist twice (and may not have given the same answer each time), the validity agreement percentage was calculated for each question against both the first and the second self-assessment. The two values were then averaged to produce the validity agreement percentage for each checklist statement.
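The averaging step above can be sketched in the same style. Again, this is a hypothetical illustration with invented answers, not the study's data or code.

```python
def validity_agreement(subject_answers, expert_answers):
    """Percent of subject-expert pairs in agreement on one item
    (the validity agreement percentage described above)."""
    pairs = [s for s in subject_answers if s in expert_answers]
    if not pairs:
        return 0.0
    agree = sum(1 for s in pairs if subject_answers[s] == expert_answers[s])
    return 100.0 * agree / len(pairs)

# Hypothetical data for one item: the ergonomist's answers, plus each
# subject's first and second self-assessment.
expert = {"s1": "yes", "s2": "no", "s3": "yes", "s4": "yes"}
t1     = {"s1": "yes", "s2": "no", "s3": "yes", "s4": "yes"}  # 100% agreement
t2     = {"s1": "yes", "s2": "no", "s3": "no",  "s4": "yes"}  # 75% agreement

# Average the two passes, as the authors did for each item.
item_validity = (validity_agreement(t1, expert)
                 + validity_agreement(t2, expert)) / 2
print(item_validity)  # 87.5
```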
The authors did not mention whether volunteers were offered training in how to apply the checklist.
Although derived from Federal OSHA documents, it is unclear whether the checklist captured the key office computer workstation concerns. A review of the checklist questions by professional peers would have addressed this issue.
This research was published in the "letters to the editor" section of the journal, so it is unlikely the content was peer-reviewed.
The Bottom Line
There is little research to support the reliability and validity of self-assessment surveys. By addressing this shortcoming, this checklist shows promise.
The authors feel their survey could replace an office workstation assessment performed by an ergonomist. However, they acknowledge that the questions related to working postures need refinement. Confidence in this evaluation method could also be buttressed by having professional peers review the questions to ensure that significant computer workstation concerns are addressed.
Article Title: Can We Trust the Answers? Reliability and Validity of Ergonomic Self-Assessment Surveys
Publication: Journal of Occupational and Environmental Medicine, 49:10 1055-1059, 2007
Authors: X Cai, J G Laestadius, S Ross, and L Dimberg
This article originally appeared in The Ergonomics Report™ on 2008-01-30.