From The Ergoweb® Learning Center

Research: Computer Workstation Self-Assessment Shows Promise

An office computer workstation self-assessment checklist was found to offer good reliability and validity in a recent comparative study by Cai et al. involving 111 office workers.  When answering the same checklist question on two separate occasions, subjects provided the same answer 78 percent of the time (reliability test).

Agreement between a volunteer and an ergonomics expert (validity test) occurred nearly 80 percent of the time when workstation features were evaluated (e.g., seat height is adjustable).  However, when working postures were assessed (e.g., trunk is perpendicular to the floor, supported by the back of the chair), agreement fell to 64 percent.

Study Design
Checklist Description
The authors created a 30-item on-line questionnaire generated from the content of two Federal OSHA computer workstation documents (http://www.osha.gov/SLTC/etools/computerworkstations/pdffiles/checklist1.pdf and http://www.osha.gov/SLTC/etools/computerworkstations/positions.html).  Workstation features were addressed by 21 questions, while 9 questions covered working postures.

Subjects
Of the 508 World Bank staff members randomly selected to participate in the study, 111 completed all requested activities.

Experiment Procedure
The subjects were asked to complete the 30-item on-line computer workstation checklist twice within 5 days.  An ergonomist then performed an office ergonomics assessment for each of the 111 subjects using the same checklist.  All activity was completed over a three-month period.

Checklist Reliability Analysis
Reliability was determined for each checklist item as an agreement percentage: the number of subjects who gave a consistent answer to a question divided by the total number of subjects who responded to the question.
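
For clarity, the sketch below illustrates this per-item test-retest calculation.  It is illustrative only; the function name, subject IDs, and answers are hypothetical and are not data from the study.

    # Sketch of the per-item test-retest agreement calculation described above.
    # All answers are hypothetical, not study data.

    def item_reliability(first_pass, second_pass):
        """Percentage of subjects giving the same answer on both administrations.

        Each argument maps a subject ID to that subject's answer for a single
        checklist item ("yes"/"no"/None); None means the item was skipped.
        """
        answered = [s for s in first_pass
                    if first_pass[s] is not None and second_pass.get(s) is not None]
        if not answered:
            return 0.0
        consistent = sum(1 for s in answered if first_pass[s] == second_pass[s])
        return 100.0 * consistent / len(answered)

    # Hypothetical answers from five subjects for one checklist item
    first  = {"s1": "yes", "s2": "no", "s3": "yes", "s4": "no", "s5": "yes"}
    second = {"s1": "yes", "s2": "no", "s3": "no",  "s4": "no", "s5": "yes"}
    print(item_reliability(first, second))  # 80.0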

Checklist Validity Analysis
The ergonomist’s assessment of a checklist statement was accepted as accurate.  Validity was established for each checklist item as an agreement percentage: the number of subject-ergonomist consistent answers to a question divided by the total number of subject-ergonomist question responses.

Because each subject completed the checklist twice (and may not have given the same answer each time), the validity agreement percentage was calculated for each question for both the first and second self-assessment.  The two values were then averaged to produce the validity agreement percentage for each checklist statement.
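
The sketch below illustrates this averaging step under the same assumptions as before: the function name and answer data are hypothetical, not the study's figures.

    # Sketch of the per-item validity calculation described above: agreement with
    # the ergonomist is computed separately for each of the two self-assessments,
    # then the two percentages are averaged.  All answers are hypothetical.

    def agreement_with_expert(self_answers, expert_answers):
        """Percentage of subjects whose answer matches the ergonomist's for one item."""
        pairs = [(self_answers[s], expert_answers[s])
                 for s in self_answers
                 if self_answers[s] is not None and expert_answers.get(s) is not None]
        if not pairs:
            return 0.0
        matches = sum(1 for a, b in pairs if a == b)
        return 100.0 * matches / len(pairs)

    # Hypothetical answers from four subjects for one checklist item
    expert      = {"s1": "yes", "s2": "no",  "s3": "yes", "s4": "no"}
    first_pass  = {"s1": "yes", "s2": "yes", "s3": "yes", "s4": "no"}
    second_pass = {"s1": "yes", "s2": "no",  "s3": "no",  "s4": "no"}

    validity = (agreement_with_expert(first_pass, expert) +
                agreement_with_expert(second_pass, expert)) / 2
    print(validity)  # 75.0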

Other Findings
Regarding reliability:

  • The most reliable workstation feature question was “Headset for phone” at 100 percent agreement
  • The least reliable workstation feature question was “Keyboard/mouse tray, if provided, is large enough to hold a keyboard and a mouse” at 40 percent agreement
  • The most reliable working posture question was “Head, neck, and trunk face forward – are not twisted” at 83 percent agreement
  • The least reliable working posture question was “Shoulders and upper arms are in-line with the torso, relaxed, not elevated or stretched forward” at 66 percent agreement

Regarding validity:

  • The workstation feature question demonstrating the most validity was “Headset for phone” at 96 percent agreement
  • The workstation feature question demonstrating the least validity was “Top of the screen is at or below eye level so you can read it without bending your head or neck down/back” at 43 percent agreement
  • The working posture question demonstrating the most validity was “Head, neck, and trunk face forward – are not twisted” at 87 percent agreement
  • The working posture question demonstrating the least validity was “Wrists and hands are straight – not bent up/down or sideways toward the little finger” at 44 percent agreement

Study Concern/Limitations
The authors did not mention whether volunteers were offered training on how to apply the checklist.

Although the checklist was derived from Federal OSHA documents, it is unclear whether it captured the key office computer workstation concerns.  A review of the checklist questions by professional peers would have addressed this issue.

This research was published in the “letters to the editor” section of the journal.  It is unlikely that the content was peer-reviewed.

The Bottom Line
There is little research to support the reliability and validity of self-assessment surveys.  By addressing this shortcoming, the study suggests that this checklist shows promise.

The authors feel their survey could replace an office workstation assessment performed by an ergonomist.  However, they acknowledge that the questions related to working postures need refinement.  Confidence in this evaluation method could also be strengthened by having professional peers review the questions to ensure that significant computer workstation concerns are addressed.

Article Title: Can We Trust the Answers? Reliability and Validity of Ergonomic Self-Assessment Surveys

Publication: Journal of Occupational and Environmental Medicine, 49(10): 1055-1059, 2007

Authors: X Cai, J G Laestadius, S Ross, and L Dimberg

This article originally appeared in The Ergonomics Report™ on 2008-01-30.