Citation

Scrivener, S., & Weiss, M.J. (2009). Opening Doors: More guidance, better results? Three-year effects of an enhanced student services program at two community colleges. New York: MDRC.

Highlights

  • The study’s objective was to examine the impact of the Opening Doors program at Lorain County Community College and Owens Community College in Ohio. The Opening Doors program at these schools provided students with enhanced counseling services and a modest stipend for two semesters.
  • The study was a randomized controlled trial. Eligible students were randomly assigned to either the treatment group, which was eligible to receive Opening Doors counseling services and a stipend, or the control group, which was not offered these services. The primary data sources were a baseline survey on the background characteristics of students, students’ transcripts from the two colleges, and degree attainment information from the National Student Clearinghouse. The study reported outcomes for the two program semesters and four post-program follow-up semesters.
  • The study found that cumulatively from the first program semester to the third follow-up semester, the treatment group registered for significantly more semesters and earned significantly more developmental credits than the control group. There was no significant difference in the proportion of treatment group students who completed a degree or certificate relative to the control group.
  • The quality of causal evidence presented in this report is high because it was based on a well-implemented randomized controlled trial. This means we are confident that the estimated effects are attributable to the Opening Doors program, and not to other factors.

Intervention Examined

Opening Doors Program at Lorain County Community College and Owens Community College

Features of the Intervention

The Opening Doors program at Lorain County Community College in Elyria, Ohio, and Owens Community College in Toledo, Ohio, provided enhanced counseling services over two semesters. Participants in the treatment group were assigned a counselor and were expected to meet with the counselor at least twice a semester to discuss academic progress and any school-related issues. Each counselor assigned to the treatment group had a smaller caseload (one counselor for every 81 and 157 treatment group students at Lorain and Owens, respectively) than counselors serving the control group (one counselor for more than 1,000 students). The smaller caseload for Opening Doors counselors was intended to allow more intensive and frequent contact with students. In addition to the counseling services, participants assigned to the treatment group received a $150 stipend in each of the two program semesters. Participants also had access to study groups with the counselors and tutoring sessions around midterm and final exams. The program operated at both colleges from fall 2004 through spring 2006.

To participate in the Opening Doors program, students had to be incoming freshmen or have completed fewer than 13 credits, be ages 18 to 34, have family income below 250 percent of the federal poverty level, hold a high school diploma or General Educational Development (GED) certificate, and not already hold an associate’s degree from an accredited college or university.

Features of the Study

This study was a randomized controlled trial, and randomization occurred at the student level. Eligible students who consented to participate in the study filled out a baseline data form. Students were then randomly assigned either to the treatment group, which was offered Opening Doors counseling services and a stipend, or to the control group, which could not receive Opening Doors services but could access existing college services. Across both schools, 2,139 students participated in the study: 1,073 in the treatment group and 1,066 in the control group.

The authors used data from the baseline data form, students’ transcripts, and the National Student Clearinghouse to measure outcomes related to progress toward degree completion. Impacts were reported as the difference of means between the treatment and control groups, adjusted for the semester of random assignment and college. This study reported outcomes for six semesters: the two program semesters and four subsequent follow-up semesters.
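As a rough illustration of this kind of regression-adjusted comparison, and not the authors' actual analysis code, the sketch below estimates a treatment effect by regressing an outcome on a treatment indicator with controls for college and semester of random assignment. The file and column names (opening_doors_transcripts.csv, treat, college, cohort, credits_earned) are hypothetical.

    # Illustrative sketch only: regression-adjusted difference in means
    # between treatment and control groups, controlling for college and
    # random-assignment cohort. All names below are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("opening_doors_transcripts.csv")  # hypothetical file

    # OLS of the outcome on a 0/1 treatment indicator plus fixed effects
    # for college and semester of random assignment (cohort).
    model = smf.ols("credits_earned ~ treat + C(college) + C(cohort)", data=df).fit()

    # The coefficient on the treatment indicator is the adjusted impact estimate.
    print(model.params["treat"], model.pvalues["treat"])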

Findings

  • Over the cumulative period from the first program semester through the third post-program semester, the treatment group registered for 0.2 more semesters and earned 0.4 more developmental credits than the control group. Both differences were statistically significant.
  • In the first program semester, the treatment group earned 0.2 more developmental credits than the control group. This difference was statistically significant, but there were no other significant differences between the treatment and control groups on course registration, credits attempted and earned, or grade point average during the first program semester. Additionally, the course withdrawal rate during the first program semester was 3.8 percentage points higher for the treatment group than for the control group. In the second program semester, there were several significant positive findings: the course registration rate for treatment group members was 7 percentage points higher than for the control group, and treatment group members attempted 0.7 more credits and earned 0.5 more credits than control group members. However, these differences did not persist across the three post-program semesters. Although the treatment group attempted an average of 0.4 more credits than the control group in the third post-program semester, there were no significant differences in course registration or credits earned in any of the three post-program semesters.
  • There was no significant difference between the treatment and control groups on the proportion of students who completed a degree or certificate over the cumulative period from the first program semester through the third post-program semester.

Considerations for Interpreting the Findings

The authors noted that the Opening Doors program was implemented more intensively at Lorain County Community College than at Owens Community College. Counselors at Lorain had smaller caseloads, and a higher proportion of treatment group students at Lorain received at least one stipend payment (93 percent, compared with 86 percent at Owens), suggesting that program uptake was higher at Lorain.

The study authors estimated multiple impacts on outcomes related to persistence toward completing a college degree. Performing multiple statistical tests on related outcomes makes it more likely that some impacts will be found statistically significant purely by chance and not because they reflect program effectiveness. The authors did not perform statistical adjustments to account for the multiple tests, so the number of statistically significant findings in this domain is likely to be overstated.
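To make this concern concrete, the sketch below applies a Benjamini-Hochberg false discovery rate correction, one common adjustment for multiple tests, to a set of p-values. Neither the choice of method nor the numbers come from the study; the p-values are invented for illustration.

    # Illustrative sketch only: adjusts a set of p-values from related
    # outcome tests for multiple comparisons. The p-values are made up
    # and are not the study's results.
    from statsmodels.stats.multitest import multipletests

    p_values = [0.004, 0.03, 0.04, 0.20, 0.45, 0.61]  # hypothetical
    reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

    # After adjustment, fewer tests may remain significant than the raw
    # p-values suggest, which is the issue the authors describe.
    print(list(zip(p_values, p_adjusted.round(3), reject)))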

Causal Evidence Rating

The quality of causal evidence presented in this report is high because it was based on a well-implemented randomized controlled trial. This means we are confident that the estimated effects are attributable to the Opening Doors program, and not to other factors.

Reviewed by CLEAR

December 2014

Topic Area