HF Perspectives on HCI
Call for Reviewers
Document URL: http://www.acm.org/~perlman/hfeshci/call.html
IF YOU
- think there have been good HFES annual meeting papers on HCI,
- are an HFES Member or Fellow,
- are in the CSTG,
- have reliable email, and
- are willing to volunteer a few hours of your expertise,
THEN PLEASE
- read the following project description and
- respond via email to volunteer to:
  hfes-hci@cis.ohio-state.edu
Introduction
The HFES has approved a proposal to create
a collection of the best papers
on Human-Computer Interaction
from the HFES annual meetings from 1983-1994.
This will tend to focus on papers from the Computer Systems
Technical Group sessions, but papers from all sessions will
be considered.
A maximum of about 75 papers will be selected,
the main constraint being an absolute limit of
400 printed pages.
A companion disk with searchable material is being considered.
This is an important opportunity to make
the best work of the HFES much more accessible to
HCI researchers, practitioners, educators, and students.
It will show the HCI community the sound work
(particularly methodological and empirical)
for which the HFES is (or should be) known,
providing a meaningful contrast to related organizations
(e.g., ACM SIGCHI).
World-Wide Web users can access information on the project at:
http://www.acm.org/~perlman/hfeshci
which includes a link to the original project proposal.
The editors (Gary Perlman, Georgia Green, Mike Wogalter)
are enlisting the help of many reviewers
to rate the quality, lasting contribution, and area
of contribution of previously published papers.
The ratings will be used to rank the papers,
and the collection will be assembled
from the highest-ranked papers.
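As a rough illustration only (not the editors' actual procedure),
the selection step amounts to taking the top-rated papers until the
75-paper and 400-page limits are reached; the sketch below assumes
hypothetical per-paper fields for the mean reviewer rating and the
page count.

    # Illustrative sketch only -- not the editors' actual procedure.
    # Assumes each candidate paper record carries a hypothetical mean
    # reviewer rating and a page count.

    PAGE_LIMIT = 400   # absolute limit on printed pages (from the proposal)
    PAPER_LIMIT = 75   # approximate maximum number of papers

    def select_collection(papers):
        """papers: list of dicts with 'title', 'mean_rating', 'pages' keys."""
        ranked = sorted(papers, key=lambda p: p["mean_rating"], reverse=True)
        selected, pages_used = [], 0
        for paper in ranked:
            if len(selected) >= PAPER_LIMIT:
                break
            if pages_used + paper["pages"] > PAGE_LIMIT:
                continue  # skip a paper that would exceed the page budget
            selected.append(paper)
            pages_used += paper["pages"]
        return selected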
We hope to have the collection published
in time for the 1995 annual meeting in San Diego.
This requires an aggressive schedule
and the cooperation of many people, especially the reviewers.
We need your help!
Please read the following review plan,
and if you are willing to volunteer,
send email along with your preferred postal address to:
hfes-hci@cis.ohio-state.edu.
Reviewers will (unless they request otherwise)
be acknowledged in the preface
and in any electronic versions of the collection.
Review-Related Tasks
Create List of Potential Reviewers
Because of the tight schedule,
only the following reviewers are being recruited:
- Only members and fellows of the HFES
(we want to draw from an experienced group of reviewers);
- From those, only CSTG members
(although papers may be drawn from any area);
- From those, only those with working email addresses
(which we will use to enlist reviewers, send instructions,
and gather reviews);
- From those, only members in the US and Canada
(because we plan to exchange some paper mail);
- And from those, only volunteers who can provide
their reviews by the end of June.
Although the criteria seem restrictive, there are about 400
qualified reviewers in the HFES, though not all will
have the time or inclination to volunteer.
Reviewers outside the U.S. and Canada with reliable
email and a complete set of proceedings back to 1983
would be welcome, but mailing delays would strain our schedule
and mailing costs our budget.
Send (via Email) A Call for Reviewers
Using the email list generated above,
we hope to enlist the help of 50-100 volunteer reviewers.
We will want multiple independent reviews for hundreds of candidate papers.
Pre-Filter the Papers for Relevance
The HFES proceedings for the annual meetings from
1987-1994 are already represented in the
HCI Bibliography
with online abstracted bibliographic entries.
The authors and titles for the earlier years (1983-1986) will be OCR scanned
so that they can be manipulated online
by the editors and later by the reviewers.
The following will be filtered out automatically:
- Abstracts only (single-page abstracts)
- Demonstration descriptions
- Panel descriptions
- Short papers
The three editors will go through the remaining papers
from the designated years and rate each
for relevance to current HCI research and practice.
Papers that are determined to be relevant
by a majority of the editors will be placed in the review pool.
We initially estimated a pool of about 300
relevant papers.
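Purely as a sketch of this pre-filtering step (the record fields and
vote encoding are assumptions, not part of the actual workflow), the
automatic exclusions and the editors' majority vote could look like this:

    # Illustrative sketch only -- hypothetical record fields, not real data.

    EXCLUDED_KINDS = {"abstract", "demonstration", "panel", "short"}

    def pre_filter(entries):
        """Drop single-page abstracts, demonstration and panel
        descriptions, and short papers before the editors rate anything."""
        return [e for e in entries if e["kind"] not in EXCLUDED_KINDS]

    def review_pool(entries, editor_votes):
        """Keep papers judged relevant by a majority (2 of 3) of the editors.
        editor_votes maps a paper id to three True/False relevance judgments."""
        kept = []
        for e in pre_filter(entries):
            if sum(editor_votes.get(e["id"], [])) >= 2:
                kept.append(e)
        return kept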
Choose and Document Review Criteria
The review criteria will be chosen to
help identify the HCI-relevant papers that the HFES
would be most proud to showcase.
We will probably include five-point scales on the overall quality
and relevance to current HCI research and practice.
Some elaborations of these criteria include:
- The paper must be unique to HCI,
not work that could be in another field,
such as cognitive or social psychology
- The paper should contribute to helping build a better
(e.g., more usable) computer system
- The paper should not be about
just using a computer; the computer or software
should be the object of study
- The paper should demonstrate basic principles,
not something that will go out of date
- The paper should represent the HFES well;
it should not be methodologically flawed
and its presentation should be, if not
exemplary, at least not an embarrassment
Along with the ratings of quality,
we will also ask for a set of keyword phrase identifiers
from a list we will provide,
as well as any other keywords thought to be useful.
These will be used to categorize the papers chosen for inclusion,
but we would also like to use the keywords to aid online search,
so we will want to gather high-quality keywords for each paper rated.
Finally, we will be asking for comments to support the ratings.
Comments about the methodology, significance, relevance to today's
technology, and other attributes that go into an overall rating
will be useful to make decisions about inclusion,
particularly if the reviews for a paper vary a lot.
Send Papers to Reviewers
We want to get three reviews per paper,
multiplied by an estimate pool of 300 papers, requires 900 reviews!
A random sample of papers, possibly spanning many different years,
will be assigned to each reviewer,
ideally avoiding obvious conflicts of interest.
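As a rough sketch only (a naive balancing scheme with hypothetical
names, not our actual assignment procedure), spreading three reviews
per paper across volunteers while skipping a paper's own authors
might look like this:

    # Illustrative sketch only -- a naive way to spread three reviews per
    # paper across volunteers while avoiding reviewers who authored it.
    import random

    REVIEWS_PER_PAPER = 3

    def assign_reviews(papers, reviewers, authors_of):
        """papers: list of paper ids; reviewers: list of reviewer names;
        authors_of: hypothetical dict of paper id -> set of author names."""
        assignments = {r: [] for r in reviewers}
        for paper in papers:
            eligible = [r for r in reviewers
                        if r not in authors_of.get(paper, set())]
            random.shuffle(eligible)                          # random tie-breaks
            eligible.sort(key=lambda r: len(assignments[r]))  # lightest load first
            for reviewer in eligible[:REVIEWS_PER_PAPER]:
                assignments[reviewer].append(paper)
        return assignments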
We plan to send printed copies so that the reviewers
do not have to (1) locate, (2) retrieve, and (3) carry
copies of the proceedings
(the difficulty of these tasks is one reason why we think
the collection is a good idea).
We have not yet determined the number of papers that will
need to be reviewed, but that in part will depend on the
number of volunteers.
The effort to review these papers should be less than
for ordinary reviews because:
- The editors will have already done some filtering,
so the paper should be more relevant than average.
- Many of the papers will already be familiar to the reviewers.
- The papers have already been reviewed and accepted,
so they should have been evaluated for technical soundness.
- The papers have been completed,
so most are in better shape than proposals.
- You will not need to write detailed reviews
for the authors because the authors will not
have the opportunity to modify the papers
(although we will consider an author's request to
exclude a paper from consideration).
We would appreciate a few words to supplement
the ratings of quality and relevance, of course.
This is to say, "We will be assigning you a bigger pile
of papers to review than you might expect, but the reviews
should not take you nearly as long as you might fear."
Receive Reviews from Reviewers
All reviews will be emailed to:
hfes-hci-reviews@cis.ohio-state.edu
using electronic templates that will be provided in
conjunction with the distribution of papers for review.
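The actual template will be distributed with the papers; purely as a
hypothetical illustration of the kind of form implied by the criteria
above, a completed review might look something like:

    Paper:                          [year, session, and title]
    Overall quality (1-5):          _
    Relevance to current HCI
      research and practice (1-5):  _
    Keywords (from provided list):  _
    Additional keywords:            _
    Comments supporting ratings:    _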