College Board publishes test bias study authored by CORE Research Director Joy Matthews-López

by Jennifer Kowalewski

Before she ever came to OU-COM, Joy Matthews-López, Ph.D., worked on cutting-edge research in the area of educational testing. Now, three years after leaving the Educational Testing Service (ETS), one of the works she co-authored has been published by the prestigious College Board.

“It is a nice piece of research,” says Matthews-López, Centers of Osteopathic Research and Education (CORE) research director. “The College Board, owner of the SAT, is a very competitive setting to be published in. It was exciting to be accepted.”

Matthews-López worked as a measurement statistician for ETS, a premier testing company that oversees the SAT, GRE, Praxis and other professional and educational assessment programs. While at ETS, her primary work was on a Spanish adaptation of the verbal section of the Graduate Management Admission Test (GMAT).

She studied cultural and linguistic issues and then adapted questions to make the test fairer for all test takers.

Her paper, “Using DIF Dissection Method to Assess Effects of Item Deletion,” was published in 2005 with co-authors Yanling Zhang and Neil J. Dorans. The researchers presented their findings at the 2003 annual meeting of the National Council on Measurement in Education in New Orleans.

And although she has left ETS behind, Matthews-López will draw on the research she helped create to improve testing procedures for medical students at OU-COM.

“We wrote several papers looking at fairness from different perspectives,” she said. “Even with the best intentions, bias can be written into tests.”

In this setting, bias refers to questions that are unfair to a specific gender or race because of differences in test takers’ experiences. Matthews-López cited the example of sports-related questions used to assess math skills. Such a question may appear fair to males taking the test, but females who are not sports-oriented may have difficulty answering it, even though they have the math skills to do so. Because the context of a question can bias the response, a sports-related math question may be biased against females.

Differential Item Functioning (DIF) is a statistical procedure used to screen standardized test items for fairness violations. Matthews-López’s article looked at DIF from a new perspective, in that gender and race were considered simultaneously.

Although previous DIF research addressed bias based on race and gender separately, Matthews-López and her co-authors examined how questions functioned when respondents’ race and gender were considered together: in other words, would a question be unfair to certain populations based on the combination of their gender and race? In every case, groups were defined in terms of reference (majority) or focal (minority) status, and all items were reviewed in that light.

The study also had to account for “equally able” test takers, meaning test takers with similar skills or ability. Typically, ability is measured by the total raw score on a test; if a test contains many biased items, however, the validity of that total score is itself called into question.

“And that is the only way we are able to group folks into a given score group,” she said.
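For readers curious about the mechanics, the sketch below illustrates one widely used DIF screen, the Mantel-Haenszel procedure, applied in the spirit described above: examinees are matched on total raw score, each item is checked for a gap between reference and focal group performance among equally able test takers, and the focal group can be defined by crossing gender and race. This is an illustrative approximation only, not the DIF dissection method used in the published paper, and all function and variable names here are hypothetical.

import numpy as np

def mantel_haenszel_dif(item_scores, total_scores, is_focal):
    """Screen one item for DIF with the Mantel-Haenszel procedure.

    item_scores  -- 0/1 array of responses to the studied item
    total_scores -- integer array of total raw scores used to match
                    "equally able" examinees
    is_focal     -- boolean array, True for focal (minority) group members
    """
    item_scores = np.asarray(item_scores)
    total_scores = np.asarray(total_scores)
    is_focal = np.asarray(is_focal, dtype=bool)

    num, den = 0.0, 0.0
    for k in np.unique(total_scores):          # one 2x2 table per score level
        at_k = total_scores == k
        ref = at_k & ~is_focal
        foc = at_k & is_focal
        a = np.sum(item_scores[ref] == 1)      # reference group, correct
        b = np.sum(item_scores[ref] == 0)      # reference group, incorrect
        c = np.sum(item_scores[foc] == 1)      # focal group, correct
        d = np.sum(item_scores[foc] == 0)      # focal group, incorrect
        t = a + b + c + d
        if t == 0:
            continue
        num += a * d / t
        den += b * c / t

    alpha_mh = num / den if den > 0 else float("nan")
    delta_mh = -2.35 * np.log(alpha_mh)        # ETS delta metric; a large
                                               # absolute value flags the item
    return alpha_mh, delta_mh

# Joint grouping, as in the article: the focal group can be defined by
# crossing gender and race (for example, African American women) rather than
# by either factor alone. The variable names below are illustrative.
# is_focal = (gender == "F") & (race == "African American")

An item that flags for a jointly defined focal group, but not for gender or race alone, is the kind of effect the simultaneous gender-and-race analysis described in the paper is meant to catch.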

The researchers found that some questions contained biases against particular gender and race groups when these two factors were considered together. For instance, African American women may be more likely to get a certain question wrong than other equally able women and equally able African American men.

Matthews-López would like to see more research done in the field, building on this study.

In addition to directing the CORE Research Office, Matthews-López is currently working on an assessment project for OU-COM. By lending psychometric expertise to the college’s existing testing program, she hopes to apply some of the ideas and screening procedures used at professional testing organizations and, as a result, improve testing conditions and policies for medical students.

“I think it’s an example of the quality people we have at OU-COM,” says Keith Watson, D.O., associate dean for postgraduate medical education. “It is a nice contribution to the literature of research out there. We can take pride in her work.”

Matthews-López has published other research papers and has presented works at national and international conferences. “Test adaptation: General guidelines and suggestions for test development” was presented at an annual meeting of the National Council on Measurement in Education in Montreal and “Using differential person functioning to detect aberrant response patterns in a standard-setting session for teachers licensure” at the Midwest Educational Research Association (MWERA) meeting in Columbus.

She also published “Navigating the Review Process for Research Involving Human Participants: An Overview and Practical Guidelines” in the Ohio Research and Clinical Review (2004) and presented “Differential Speediness: A look into subgroup differences” at MWERA (2003).
