Assessing surgical skill is critical to improving patient care while reducing medical errors, length of stay, and readmission rates. Crowdsourcing offers one potential method for accurately assessing surgical skill, but only recently has it been studied as a valid way to provide feedback to surgeons. The results of such studies are explored here.

Data Sources:
A systematic literature search was performed on PubMed to identify studies that have attempted to validate crowdsourcing as a method for assessing surgical skill. Through a combination of abstract screening and full-text review, 9 studies that met the inclusion criteria were identified and reviewed.

Conclusions:
Crowdsourcing has been validated as an effective way to provide feedback on surgical skill. It has been shown to work in both dry-lab and live surgical settings, across a variety of tasks and assessment methods. However, further studies are needed to confirm that crowdsourcing can provide quality feedback in a wider range of scenarios.