Abstract: This paper describes the development and application of a methodology used to build an instrument for collecting detailed user evaluations. Although the methodology is general, the instrument developed was specifically intended for collecting user evaluations pertaining to text editors. The methodology incorporated user suggestions and resulted in a matrix of 16 text editing functions and 17 adjective scales. The application of this matrix to an existing editor revealed that a consistent set of scales was appropriate for evaluating all 16 editing functions.
Keywords: Empirical studies; Evaluation, subjective; Evaluation; Software development; Survey; Text editors
Originally published: Proceedings of the Human Factors Society 29th Annual Meeting, 1985, pp. 240-244
Republished: G. Perlman, G. K. Green, & M. S. Wogalter (Eds.) Human Factors Perspectives on Human-Computer Interaction: Selections from Proceedings of Human Factors and Ergonomics Society Annual Meetings, 1983-1994, Santa Monica, CA: HFES, 1995, pp. 29-33.