PURPOSE: To evaluate an open-ended, computer-scored testing format designed to overcome certain limitations of multiple-choice questions. METHOD: Test items covering content in family medicine were administered in two different formats to 7,036 resident physicians in 380 training programs, and to 35 experienced, board-certified physicians, in conjunction with the In-training Examination of the American Board of Family Practice. Examinees completed a booklet of 40 open-ended, uncued (UnQ) test items by selecting the answer to each item from a list of over 500 responses. Similar items were administered in the standard multiple-choice question (MCQ) format. One year later, another test of 40 UnQ items covering core content in family medicine was administered to 7,138 residents. RESULTS: Examinees recorded more than 560,000 UnQ responses with high compliance and few errors. Both reliability and validity were higher for the UnQ format than for the MCQ format, and the UnQ items discriminated more accurately among levels of physicians' experience. The UnQ format virtually eliminated the possibility that physicians could answer questions by sight recognition or random guessing, and it was particularly effective in measuring knowledge of core content. CONCLUSIONS: This study supports the feasibility of administering open-ended test items to enhance tests of physicians' competence.