Using ACT Online Prep To Improve Score Gains

Test preparation plays an important role in high-stakes standardized testing. Test preparation companies may claim large, at times unrealistic, score gains from products that can be very costly; scientific research supports a more moderate impact of test preparation. These impacts should not be dismissed, however, as even small improvements in test scores can make a difference for college admissions and scholarship eligibility.

ACT offers a number of test preparation opportunities for students who are planning to take the ACT test. ACT’s test preparation suite of solutions includes the Official ACT Prep Guide, ACT Academy, ACT Online Prep (AOP), ACT Rapid Review Live, ACT Rapid Review On Demand, and ACT Rapid Review All Access. Each of these solutions caters to different student learning styles and strategies.

In this study, we focus on students' usage of AOP. AOP provides two learning paths for students. The first offers structured, comprehensive coverage of all subjects at all levels, regardless of the student's starting point. The second is an adaptive learning plan that uses input from diagnostic tests to identify areas of weakness to target first. AOP presents official ACT test materials in a format that ensures students can cover all content areas tested.

In particular, it affords access to over 200 hours of content, including over 2,400 practice items. The current study examined the score gains students attained between official ACT test administrations when they used AOP between the tests.

Author: Edgar Sanchez, PhD

Edgar Sanchez, a senior research scientist in the Validity and Efficacy Research department at ACT, works on predictive modeling of student educational outcomes. He is currently focusing on the efficacy of test preparation programs.


This study highlighted the positive association between using ACT Online Prep and ACT score gains over time. In particular, it demonstrates positive associations between score gains and the number of active days in AOP, the number of practice sessions completed, the number of full-length practice tests taken, and the number of system resets. The association of each AOP usage variable with ACT score gain was estimated using a multiple linear regression model that controlled for several student characteristics as well as all other AOP usage variables.
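The modeling approach described above can be sketched in code. This is an illustrative example only, not the study's actual code, data, or variable definitions: the usage variables, the covariate, and all coefficients below are hypothetical, and the model is fit with ordinary least squares so each coefficient reflects one variable's incremental association while holding the others constant.

```python
# Hypothetical sketch of a multiple linear regression of ACT score gain on
# AOP usage variables plus a student covariate. Simulated data; all names
# and effect sizes are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical usage variables
active_days = rng.integers(0, 30, n)
practice_sessions = rng.integers(0, 20, n)
practice_tests = rng.integers(0, 4, n)
system_resets = rng.integers(0, 12, n)
# Hypothetical student covariate (e.g., a baseline score)
baseline_score = rng.integers(14, 30, n)

# Simulated score gain: small positive usage effects plus noise
gain = (0.03 * active_days + 0.04 * practice_sessions
        + 0.30 * practice_tests + 0.05 * system_resets
        + 0.02 * (30 - baseline_score) + rng.normal(0, 1.5, n))

# Design matrix with an intercept; fitting all usage variables together
# yields each variable's incremental association, controlling for the rest
X = np.column_stack([np.ones(n), active_days, practice_sessions,
                     practice_tests, system_resets, baseline_score])
coef, *_ = np.linalg.lstsq(X, gain, rcond=None)
names = ["intercept", "active_days", "practice_sessions",
         "practice_tests", "system_resets", "baseline_score"]
print(dict(zip(names, coef.round(3))))
```

With simulated data the recovered coefficients approximate the effects built into the simulation; in the actual study, of course, the coefficients were estimated from observed student records.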

The results provide an estimate of the difference in ACT score gain between not using AOP and using it "optimally". For example, we estimate that a student who used AOP for over 21 days, reviewed over 55 lessons, completed at least 16 practice sessions and at least two practice tests, and reset the system between 6 and 10 times would have an average score gain 1.51 points higher than if the student had not used the AOP system.

Students who purchased AOP but did not use the product also saw gains in their ACT retest score, as would be expected given the additional instruction that occurs between tests. The average gain for students who did not use AOP was 1.13 points, while the average gain for students who used AOP optimally was 2.64 points (i.e., 1.13 + 1.51).
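The arithmetic above can be stated directly: the predicted gain for an optimal user is the non-user baseline gain plus the model-estimated usage effect.

```python
# Reproducing the report's arithmetic: baseline gain for non-users plus the
# estimated effect of "optimal" AOP usage.
baseline_gain = 1.13          # average gain without AOP usage
optimal_usage_effect = 1.51   # model-estimated additional gain
optimal_gain = baseline_gain + optimal_usage_effect
print(round(optimal_gain, 2))  # 2.64
```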

There are certain limitations to this study that should be borne in mind. The most important is that this analytical method only allows us to speak of the association between gain scores and use of AOP features; the study does not support causal claims. Rather, it helps us understand the experience students are having with their AOP usage.

A second limitation is that information on additional test preparation activities students may have engaged in was not available. AOP may have been the sole or primary activity between the two test dates, but it may also have been complementary to other test preparation activities. If students who did not use AOP were more likely to engage in other types of test preparation not captured by this study, the study likely underestimates the impact of AOP usage.

An attempt to capture some of this variability in other test preparation activities was made by including the number of months elapsed between the two tests in our model. The reasoning was that as the number of months between the two tests increased, so did the opportunity for other activities, including school learning.

Another point to consider is that, by using an ANCOVA approach that includes all types of usage simultaneously, we see the incremental effect of each; these estimates are likely smaller than if a model were estimated with only one usage type at a time. Notwithstanding these limitations, this study demonstrates a positive association between AOP usage and ACT Composite score gains over time.
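The point about incremental versus single-predictor estimates can be illustrated with simulated data (hypothetical variables and effect sizes, not the study's data): when two usage variables are positively correlated and both contribute to the gain, a model with only one of them attributes part of the other's effect to the included variable, so its coefficient exceeds the incremental estimate from the joint model.

```python
# Hypothetical illustration: joint (incremental) vs. single-predictor
# coefficient estimates when usage variables are correlated.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
sessions = rng.normal(10, 3, n)
tests = 0.5 * sessions + rng.normal(0, 1, n)  # correlated with sessions
gain = 0.1 * sessions + 0.4 * tests + rng.normal(0, 1, n)

def ols(predictors, y):
    """OLS fit with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    return np.linalg.lstsq(X, y, rcond=None)[0]

joint = ols([sessions, tests], gain)  # incremental effect of sessions
solo = ols([sessions], gain)          # sessions alone absorbs tests' effect
print(f"joint: {joint[1]:.2f}, solo: {solo[1]:.2f}")
```

The single-predictor coefficient for `sessions` comes out larger than the joint-model coefficient, matching the caution above about interpreting the incremental estimates.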