Index Evaluation Workshop
By Laura A. Ewald
Spring 2013
While the term “peer review” can send a chill up the spine of any writer, nowhere is that chill sharper than among indexers. We are a hardy lot, but we work solo most of the time, with our brains, expertise, and personalities wholly engaged in every project, and though every indexer expects to find some mistakes in past indexes, we would prefer to find those “oops” moments ourselves rather than advertise to our peers that we’re not perfect.
And yet, how can we get better at what we do if we don’t evaluate our work? And, as with anything else creative, how can we, as “author,” really evaluate an index we have written? As both a writer and a proofreader/editor in my other life, I know for a fact that an author is generally too close to the work to catch all the mistakes. As an indexer, I know that an author is too close to his or her work to index it effectively, and that a fresh set of eyes, belonging to a professional indexer, is needed to evaluate the work and make the index the best it can be for the reader.
Well, at the spring 2013 Heartland Chapter meeting in Indianapolis, attendees were given the opportunity to be both reviewer and “reviewee” in the peer review process, thanks to Margie Towery’s[i] great workshop, Evaluating an Index Together. We gathered at the MCL Restaurant & Bakery meeting room for this two-part workshop, which was based on the Wilson Award Index Evaluation Worksheet[ii] and used the Wilson Award Short-form Checklist as our guide.[iii]
One of the things Margie clarified for us is that evaluation is not the same as editing. Evaluation looks at the big picture: overall usability, coverage, analysis, and style. Editing is about getting the little details right. Certain items on the checklist matter to both, of course: accuracy in spelling, alphabetization, and cross-references will certainly be edited, but how those things contribute to the way the index works for the reader is part of the evaluation.
Margie began the workshop by handing out copies of one of her own indexes from the early 1990s, which gave attendees the opportunity to look at an index Margie had done when she was not far from where most of us are in our indexing now. The whole group then worked through the Wilson Award Short-form Checklist to evaluate the index’s Mechanics, Substance, and Elegance.
Mechanics covers accuracy, cross-references, double-postings and flipped entries, format and layout, locators, names and terms, and style. The Substance category includes analysis, coverage, creative problem solving, main headings, subheadings, terms and access, and usability. Elegance is the most subjective of the three areas of evaluation. It includes overall impressions, such as visual appeal, readability, and consistency, but it also touches on intangibles such as “graceful simplicity” and “precise richness.” Pretty heady stuff, isn’t it? It is not always easy to apply, but we could recognize it in the several previous Wilson Award-winning indexes Margie brought to share with the group.
After lunch, we broke into small groups, which Margie assigned based on the subjects of the books whose indexes we had brought for this exercise. (Pre-meeting publicity had asked each of us to bring three copies of an index we had written or, if we had not yet written one, an index we had used in a published book.) We each took time to evaluate the indexes individually using the Wilson Checklist and then discussed our findings within our small groups. This was the scary part! But as Margie herself says, “Index evaluation is the most important way to continue to improve one's indexing, no matter the stage you're at or the type of material you index. It is, of course, scary, as several comments on the evaluations note. Maybe that is why the practice has declined.”
What an eye-opener this small-group exercise was! Having a good idea of what we were looking for, based on the morning’s group discussion of Margie’s early index, we could delve into these new indexes, Wilson Checklist in hand, with at least an idea of where to start. And yet everyone in my small group approached the evaluation from a different angle, and the feedback I received on my own index pointed things out to me, both good and bad, that can only help me with the next one.
Finally, we reconvened as a large group, and each small group shared its findings. Again, the range of what we focused on in our evaluations was broad and gave us all something to add to our own internal evaluation checklists. Some attendees focused on the headings and cross-references of their indexes, others on usability, and still others on style. One of the most interesting points concerned an index whose headings seemed questionable or confusing to those of us who didn’t know the topic well; the person who brought the book explained that for the book’s audience they were perfect and, in fact, enhanced the usability of the index. (See “terms and access” under “Substance”!)
“The key,” Margie says, “is to learn the concept and then apply it.” She goes on to explain:
“Unlike most other indexers, I learned indexing in an apprenticeship. The most valuable thing was that my mentor took her red pencil to my indexes and indicated what was wrong or could be handled better, as well as alternatives to consider—and then, most importantly: why.
“I believe that sort of evaluative practice, really delving into the contents and structure of my indexes, thinking about alternatives to the way I'd done it, and learning about the underlying reasons, is what made me a good indexer (along with a willingness to play with words; be creative; use my intuition; analyze, digest and simplify; and disregard the rules when it makes an index more usable).”
Would Heartland Chapter members agree? According to the results of the workshop survey, the answer is a resounding “Yes!” Given the statement, “The morning demonstration of the index evaluation process was helpful,” 77% strongly agreed and 23% agreed; given the statement, “The directed peer review portion will help me be a better indexer,” 85% strongly agreed and 15% agreed. The only negative feedback came in response to the statement, “The pacing and time allocated to index evaluation was adequate,” to which only 24% responded “strongly agree,” 38% agreed, and 38% disagreed! The Heartland members wanted more! Scary? Yes! Useful as a teaching tool? Absolutely! As one attendee put it, “I need this kind of feedback! Aside from great conversations, we all feel a bit vulnerable about having our indexes evaluated—this makes for a fun dynamic and keeps us in a humble, teachable frame of mind.”
And a final note from Margie:
“As indexers, we are all open to learning new stuff with each project we tackle. We are willing to learn new technology to keep up. We attend workshops about other indexes, indexing techniques (e.g., for editing, names, specific materials), project and time management, and even ergonomics. Index evaluation, which gets at the specific core of an indexer's practice in a way that editing (or looking at bits and pieces of indexes) cannot, is absent.”
For members of the ASI Heartland Chapter who attended the spring 2013 meeting, index evaluation and the peer review process now appear in a whole new light. With comments like “Necessary, indispensable” and “Critical for my own quality control, a great use of time,” it is clear that this group of indexers, at least, will be first in line the next time a workshop on index evaluation or a peer review session is offered.
[i] Margie Towery won the H. W. Wilson Award for Excellence in Indexing twice: in 2002, for the Cumulative Index to The Letters of Matthew Arnold (six volumes, edited by Cecil Lang, University of Virginia Press), and in 2008, for the index to The History of Cartography, volume 3: Cartography in the European Renaissance, parts 1 and 2 (University of Chicago Press). She served on the Wilson Committee in 2011, 2012, and 2013 (as incoming chair, chair, and outgoing chair, respectively) and conducted two Mock Wilsons, one for the Heartland Chapter and one for the Chicago Chapter.
[ii] The award has since been renamed the ASI/EBSCO Publishing Award.
[iii] Complete criteria for the ASI/EBSCO Publishing Award are listed here.
Photo: Jim Fuhr, Linda Presto, and Mary Peterson review each other's indexes.
Photo: Cherry Delaney, Margie Towery, and Sue Klefstad discuss their evaluation insights.