Join ArborBridge’s Weekly News Flash and receive the latest headlines in test prep and college admissions every Tuesday


We are committed to providing you with the most up-to-date resources and announcements from the college admissions testing landscape. Here are some of the top headlines from this past month:


ACT Computerized Test Debuts with Some Issues

Summary: Last month, ACT administered its first wide-scale computer-based test (CBT) for international students. According to Inside Higher Ed, there were a handful of issues: at least 39 test sites were canceled at the last minute because they weren’t ready to administer the exam, and some students at other sites were unable to complete the exam due to technical issues. On social media, we saw reports from test-site administrators that when they opened for testing on test day, the ACT’s computer system experienced glitches that prevented them from checking in students, resulting in canceled exams. Our own students reported smaller tech glitches that disrupted but didn’t cancel testing (such as an entire site’s computers shutting down midway through a section). In these cases, student progress was saved and timers were paused so students could resume testing once systems restarted. Our students also reported non-technical hiccups, such as proctors who were unsure of how to handle issues or generally unfamiliar with the new exam format.

What this means: 

  • Despite these issues, the majority of students were able to take the exam.
  • But the prevalence of issues (both big and small) reinforces the fact that any student taking the ACT internationally must have a backup test date and plan in mind. Students must also be prepared for glitches that may disrupt their test experience and have the tools to get back into the test mindset once the test resumes.
  • For international students, ArborBridge will continue to (more often than not) recommend the SAT over the ACT until we see a more consistent CBT experience and fewer widespread issues.

Glitches on ACT (Inside Higher Ed)


Removal of the Wrong-Answer Penalty Reduces Gender Gap in Test Performance

Summary: A recent study found that removing the penalty for wrong answers on multiple-choice exams reduces the gender gap in test performance by 9%, according to the Harvard Business Review. The study used data from Chile, where a large-scale policy change removed the penalty for wrong answers on the national college entry exam. When the penalty was still in place, women were significantly more likely than men to skip questions, but when the penalty was removed, the gender gap disappeared. The paper argues that the wrong-answer penalty may result in “worse test scores for women than equally knowledgeable men.”

What this means: This is a complicated story and requires a bit of unpacking.

  • While this study does not look at the standardized tests our students take, it is an interesting study for Americans nonetheless. There are still a handful of exams in the American system that use a guessing penalty (the SAT Subject Tests and the SSAT are the most popular). The AP and SAT did away with guessing penalties in the last few years primarily for popularity reasons. This new data provides convincing evidence that there are other good reasons to eliminate guessing penalties.
  • We don’t expect the Subject Tests to necessarily follow suit at this point. The College Board has largely ignored updating the Subject Tests while it focused on SAT and AP revamps, and there are even whispers that the Subject Tests may go away as colleges stop requiring them. It’s unlikely the College Board will invest in redesigning the Subject Tests and recalibrating them without a guessing penalty until the Subject Tests’ future is more secure.
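For readers curious about the mechanics behind the study’s finding, the penalty’s effect on guessing comes down to simple expected value. The sketch below works through the quarter-point deduction traditionally used on five-choice questions (as on the SAT Subject Tests); the function name and scenarios are just illustrative:

```python
from fractions import Fraction

def expected_points(n_choices, penalty, n_eliminated=0):
    """Expected score from guessing on one question after eliminating
    some answer choices: P(right) * 1 + P(wrong) * (-penalty)."""
    remaining = n_choices - n_eliminated
    p_right = Fraction(1, remaining)
    return p_right * 1 + (1 - p_right) * (-penalty)

# Blind guess, five choices, quarter-point penalty:
print(expected_points(5, Fraction(1, 4)))     # 0 — guessing gains nothing on average
# Eliminate one wrong choice first and guessing becomes worthwhile:
print(expected_points(5, Fraction(1, 4), 1))  # 1/16
# With no penalty, even a blind guess has positive expected value:
print(expected_points(5, 0))                  # 1/5
```

Because a blind guess averages out to zero under the penalty, test-takers who are less willing to gamble rationally skip questions instead; remove the penalty and guessing is always better than skipping, which is why the skipping gap (and with it part of the score gap) disappears.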

The Impact of Penalties for Wrong Answers on the Gender Gap in Test Scores (Harvard Business School)


College Board Adds Optional Service-Learning Component to All AP Courses

Summary: Last year, the College Board (in collaboration with WE) ran a successful pilot study integrating a service-learning component in 6 courses at a variety of test schools. As part of the component, AP students participate in a project within the WE framework, and “apply their academic learning to real-life settings and situations by being active in meaningful community-based service.” This year, the program will be available to every school and in every AP course. Participation in the program is up to a student’s school and teacher.

What this means: 

  • It’s not clear yet how extensive participation will be, but with a growing emphasis in education on real-world application and civics education, this option may be very popular among certain schools.
  • The one element that could detract from the popularity of such a program is the time spent preparing for the final AP exam. Teachers may find it difficult to add yet another component to the school year when they are already pressed to finish teaching content for the exam. We may find that this project is popular with AP courses in school districts that don’t end the school year until June; these schools may use service learning to fill the gap between the AP exam in May and the end of the school year.

Thousands of Students Will Be Able to Engage in Service Learning While Taking AP (College Board)

AP with WE Service (College Board)