By David E. Hailey, Ph.D.
Professional and Technical Communication
Utah State University
I suggest that while usability studies are becoming increasingly sophisticated in evaluating complex information systems, they are in danger of leaving content evaluation behind. In recent studies, I found that out of more than a hundred experienced professional writers, none could find egregious problems in simple Web pages. The problem seems to arise from a common inability among professional writers to identify the different genres they see in digital media. This chapter examines that problem and suggests solutions (tests) that can be introduced into usability studies or run in coordination with them.
One might reasonably ask, "Why does a chapter on rhetorical analysis belong in a book about usability studies in complex information systems?" The purpose of this chapter is to answer that question and the additional questions that answer implies. As bodies of information and their sources have become more complicated, our ability to evaluate them has been forced to evolve, and although usability studies experts have systematically developed processes that evaluate the structure, navigation, and design of these documents, I suggest that analysis of the quality of content has not kept up.
Contemporary digital documents may be constructed from multiple resources archived around the world. These documents often interact in real time with individual readers using dossiers that a variety of computers may have compiled. These dossiers are often stored in databases alongside millions of similar dossiers describing other readers. For example, at Amazon.com, different customers see entirely different landing pages; CBSNews.com automatically includes my local news and weather on its homepage (the weather is lifted from a sensor at the local airport, and the news is extracted from my local newspaper's database).