The University System of
Georgia (USG), in an effort to raise statewide degree attainment to 60% by
2025, has launched an ambitious set of system-led student success projects. This
work began in earnest in 2011 with the Complete College Georgia (CCG)
initiative, in partnership with Complete College America, and has expanded over
the last two years to include the Momentum Year, a
collection of first-year efforts designed to improve student persistence
through college to graduation:
Making a purposeful program choice
Creating a productive academic mindset
Attempting the first 30 hours of a clear pathway
Attempting 9 hours in the Academic Focus Area
Completing initial English and math courses
Now we are shifting from the Momentum Year to the Momentum
Approach, which seeks to incorporate these same principles throughout a
student’s academic career. This work also includes the recent system-wide shift
to co-requisite remediation, away from the traditional learning support model
whereby remedial courses and college-level courses are taken sequentially.
Research and data support has been integral to the
development and implementation of USG’s student success initiatives. Through
standard reporting, research and analysis, and technical support, USG staff have
provided the critical work necessary to inform and sustain system-wide student
success efforts. This post highlights a few of these key efforts, demonstrating
the vital role of SHEEO agency data offices, data collection and reporting
infrastructure, as well as data analyst and business intelligence staff in
accomplishing system- and statewide goals.
Our readily available collection of standard reports has
helped equip system leaders and staff to discuss the CCG and Momentum Approach efforts
in meaningful ways. Leveraging our existing reporting infrastructure — a team
of business intelligence analysts and researchers as well as our data warehouse
built for standard reporting — we provide a variety of standard reports and ad
hoc reports to support system meetings, presentations, conferences, workshops,
and the like. We also produce a series of reports that include longitudinal,
disaggregated data on a variety of postsecondary outcomes for each institution (retention,
graduation, degrees conferred, credit hour accumulation, learning support
completion, etc.). This work supports campus reporting and analysis related to
CCG initiatives, especially for our smaller institutions that have limited
institutional research capacity. We are currently developing interactive
visualizations of the CCG data to better support system and institutional leaders.
In-depth analysis and research have helped demonstrate to
various stakeholders the benefits of co-requisite remediation as opposed to
traditional learning support models (known as Foundations in the USG). This
work shows that, regardless of academic preparation, students who participated
in co-requisite learning support were more likely to earn a passing grade in
gateway English and math courses than students in traditional Foundations
courses (see Figure 1 below). In fact, gateway course pass rates for
co-requisite learning-support students were most similar to those for students
with no learning support requirements.
Likewise, our analytic work has highlighted the advantages of taking a fuller course load in the first year. Through a propensity score matching analysis — a replication of a study from the Community College Research Center — we learned that USG students who attempt at least 30 hours in the first year are 13 percentage points more likely to graduate within six years than students who attempt fewer than 30 hours. Moreover, this analysis demonstrated that regardless of academic preparation, taking a fuller course load was beneficial to all students (see Figure 2 below). In fact, the marginal benefit of a fuller course load was most pronounced among the least academically prepared.
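The matching logic at the heart of such an analysis can be sketched in a few lines. This is a toy illustration, not USG's actual model: the propensity scores below are assumed to come from a logistic regression of the treatment (attempting 30+ hours) on preparation covariates, and the student records are invented.

```python
# Toy nearest-neighbor propensity score matching (illustration only;
# scores would normally be estimated from covariates such as
# academic preparation).

# Each record: (propensity_score, graduated_within_6_years)
treated = [(0.62, 1), (0.55, 1), (0.71, 0), (0.48, 1)]   # attempted 30+ hours
controls = [(0.60, 1), (0.50, 0), (0.70, 0), (0.45, 0), (0.58, 1)]

def matched_effect(treated, controls):
    """Match each treated student to the control with the nearest
    propensity score and average the outcome differences (the
    average treatment effect on the treated)."""
    diffs = []
    for score, outcome in treated:
        _, c_outcome = min(controls, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - c_outcome)
    return sum(diffs) / len(diffs)

print(f"Estimated effect on graduation: {matched_effect(treated, controls):+.2f}")
```

A real replication would also involve caliper restrictions, matching with or without replacement, and covariate balance checks; the sketch only shows the core match-and-difference step.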
Technical support provided by system office researchers,
business intelligence analysts, data warehouse architects, and learning
management system (LMS) experts has enabled the implementation of co-requisite
remediation by automating learning support placement and building the course
registration processes. The shift to co-requisite remediation system-wide was
accompanied by a paradigm shift in the placement of
learning support students and necessitated a new approach to course
registration to ensure simultaneous enrollment in remedial and gateway courses.
We also had to consider how to capture this new information in the Academic Data
Collection (ADC) in ways that would support reporting and analysis of these
initiatives. To accomplish this, we considered what senior leaders would
ultimately want to know about the new learning support model and how we could
conduct rigorous analyses to determine its effectiveness. Bringing these
questions into the data entry and collection phases has helped prepare us for future
reporting and analytic work.
We took the same approach when adding data elements
related to students’ academic focus area, one of the Momentum Approach
elements. Including the focus area in student registration data allows academic
advisors to better guide course selections. We will soon begin collecting this
new data element at the system level, further enhancing our reporting and
analysis on system-wide student success initiatives.
Lastly, we have provided technical support in the form of
secure storage of sensitive information linking student surveys to
administrative data. Since Fall 2017, the USG has partnered with Motivate Lab at the University of Virginia to
develop and administer a system-wide mindset survey to incoming first-year
students. The purpose of the survey is to better understand students’
motivations in attending college, mindsets around math and English (in other
words, what students believe about their learning capabilities), as well as an
inventory of scarcity-related items (food insecurity, housing, ability to pay
for college, etc.). System office research staff have supported this work by
securely storing the survey data as well as linking survey responses to
administrative data, allowing USG and Motivate Lab analysts to determine how mindset,
motivation, and scarcity relate to a variety of postsecondary outcomes. Read
more about this work here.
Reporting, analysis, technical expertise, and the intricate
ways in which these efforts work together are critical for the development and
implementation of large-scale efforts such as the USG’s Momentum Approach.
Improving college affordability through the implementation of various financial aid programs has
been the goal of not only the federal government, but also of state
governments, numerous philanthropic organizations, and many individual
postsecondary institutions. Financial
aid data, especially metrics regarding unmet need, can be very helpful in
determining the effectiveness of financial aid awarding strategies on college
affordability. The University of Hawai‘i has developed several metrics around
the use of EFC bands to assist in determining how successful aid-awarding
policies are so adjustments can be made quickly and easily by financial aid
officers.
Average Net Cost
Students with lower family incomes (and lower Free Application for
Federal Student Aid [FAFSA] expected family contributions) are generally more
price conscious, and minority students are disproportionately represented in
this group. Policymakers
and college administrators must understand the role of financial aid within
different contexts to promote college opportunity and degree completion for all
students. We often look at financial aid metrics that describe net cost,
usually in terms of average net cost for all students or average net cost for
all students receiving grant assistance, to better understand the impact of
financial aid. Yet with so much money from various sources going into gift
assistance for deserving and financially needy students, and so much data on
average student unmet need available, many students — especially those who are
first-generation college-goers and those historically underrepresented — still
misunderstand the true net cost of college and continue to think a college
degree is unaffordable for them. Maybe that’s because average
net cost is far from some individual students’ actual situations. Average
student net cost means little to students who have very high or very low family
incomes, or very high or very low EFCs. Although we have learned a lot from looking at net
cost for students based on total family income ranges, the University of
Hawai‘i campuses realize that examining financial aid and net cost in terms of
FAFSA EFC ranges might be even more helpful.
Why EFC Ranges?
A student’s EFC is an index produced by the federal
methodology formula used in the FAFSA calculation. The EFC (rather than family
income alone) represents a family’s financial position regarding its ability to
pay for college, and is used by Title IV postsecondary institutions to
determine eligibility for federal financial aid. A student’s EFC is often also
used by institutions to award institutional aid. Determining average
net cost by EFC ranges makes sense because the EFC accounts not only for a
family’s adjusted gross income but also for other important factors, such as
family size. An independent student with a family size of two and an
adjusted gross income of $30,000 could have a higher FAFSA-determined EFC than
a dependent student in a family of six with two in college and a total family
AGI of $80,000. Since aid is generally awarded based on EFCs (rather than family
income) and because a student’s EFC takes into account a number of other
factors that represent different scenarios for students’ household incomes, why
not look at net price in terms of EFC ranges as a measure of affordability?
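As a rough sketch of what net cost and unmet need by EFC band might look like in code, consider the following. The band cut points, record layout, and dollar figures are invented for illustration; they are not the University of Hawai‘i's actual schema or policy.

```python
# Illustrative EFC bands (cut points are assumed, not UH policy).
EFC_BANDS = [(0, 0), (1, 5576), (5577, 12000), (12001, 20000), (20001, None)]

# Each record: (efc, cost_of_attendance, gift_aid) -- invented examples.
students = [
    (0, 18000, 14000),
    (3000, 18000, 9000),
    (9000, 18000, 5000),
    (25000, 18000, 2000),
]

def band_of(efc):
    """Return the (low, high) EFC band containing this EFC."""
    for lo, hi in EFC_BANDS:
        if hi is None or lo <= efc <= hi:
            return (lo, hi)

def metrics_by_band(students):
    """Average net cost and average unmet need within each EFC band."""
    grouped = {}
    for efc, coa, gift in students:
        net_cost = coa - gift                 # what the student must cover
        unmet = max(coa - efc - gift, 0)      # gap after EFC and gift aid
        grouped.setdefault(band_of(efc), []).append((net_cost, unmet))
    return {band: (sum(n for n, _ in rows) / len(rows),
                   sum(u for _, u in rows) / len(rows))
            for band, rows in grouped.items()}
```

Because aid is packaged in terms of EFC, results grouped this way translate directly into adjustments to awarding parameters for the bands with the largest average unmet need.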
Impact on Financial Aid Strategies
Metrics on average net cost by EFC ranges could be very useful to community
college leadership, enrollment management staff, and financial aid personnel in
making decisions regarding financial aid strategies to encourage enrollment and
persistence toward degree completion. This approach would support data-driven
decision-making about effective aid-awarding practices: disaggregating students
by net cost in this way converts easily into financial aid packaging
parameters, since that is how most financial aid is awarded. With the
assistance of the National Center for Higher Education Management Systems (NCHEMS),
the University of Hawai‘i’s Institutional Research & Analysis Office, for
example, uses students’ FAFSA EFC in terms of ranges to look at unmet need
after scholarships and grants are awarded. Using this information, along with a
historical look at the distribution of aid in various need-based financial aid
programs, helps financial aid officers identify how to redistribute aid
according to which EFC bands of students need more aid to reduce the gap in
unmet need. In this way, financial aid can be more strategically distributed
quickly and easily since the EFC and ranges of EFC are frequently used in
awarding various types of financial aid. Even more could be done to leverage
financial aid resources if state data systems included comparisons of unmet
need by EFC ranges, so we could all learn from each other.
The state of
Minnesota invests heavily in higher education, both through our appropriations
for need-based aid, and financial support for our public colleges and
universities. Despite these investments, we have no good measure of whether
these investments are making college more affordable, and for how many
students. How can we define college affordability?
College could be considered affordable for families able to write a check to cover all
costs, as well as for those with a family income so low that 100% of their
costs are covered by state and federal aid. A broader sense of affordability
could be that you leave college with manageable debt, and your education helps
you secure a job with a paycheck that makes those loan payments manageable.
Is college affordability defined by a student’s career goals? A two-year community
college is affordable, but what if a public research university best fits the
student’s needs and aspirations? Without adequate resources, the student may
not be able to attend this public four-year college.
State policymakers must look at college affordability through a broader lens — are we
investing enough taxpayer dollars to extend opportunities for college
enrollment and success to enough students? How many is “enough”? And is the
percentage of income and assets required from families and students reasonable
or too burdensome?
In 2018, the Minnesota Office of Higher Education (OHE) began
the task of defining affordability, both for students and state lawmakers. This
work will culminate in the development of guiding metrics to set goals for
future financing for both system appropriations and financial aid.
For families and students, defining college affordability should be about the math — are you
able to purchase the necessary and appropriate education and at the same time have
enough money to cover essential needs such as food and housing? If the math
doesn’t work, the student is at an increased risk of dropping out or may simply
not enroll at all.
For the state, affordability should take into account, by
income, the share of individuals who can afford to enroll. In this calculation,
limited state resources and political will are in competition with the growing
need for educated workers and the value of investing enough to encourage
completion, not just enrollment.
For both individuals and the state, affordability is measured over time — when a student
begins their education, over the student’s lifetime, and during the loan
repayment period. Affordability measured at the student’s entrance asks this
question: Does the student have the
required level of resources to fully pay the cost of attendance on day one? If
the cost of attendance exceeds their resources (which include grants, work,
family contributions, and loans), that college choice is not affordable for
that student. It is critical that this measurement is based on a reasonable
number of hours the student must work, as well as a reasonable family
contribution, and student loans. If these three factors don’t align, the
student may be able to go to college, but they are less likely to complete and
if they do, are more likely to have unmanageable debt.
On the other
hand, if resources exceed the cost of attendance, this could indicate that the
state is not being efficient in using state resources. This is a moving
benchmark that must be reconsidered as costs or resources change.
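The day-one test described above reduces to a simple comparison of resources against cost of attendance. In this minimal sketch, the "reasonableness" caps on work hours, wage, and borrowing are assumed illustrative values, not Minnesota OHE policy parameters.

```python
# Assumed illustrative parameters (not Minnesota OHE policy).
REASONABLE_WEEKLY_HOURS = 15     # hours of work while enrolled
HOURLY_WAGE = 12.00              # assumed wage
ACADEMIC_YEAR_WEEKS = 36         # assumed length of the academic year
REASONABLE_LOAN = 5500           # e.g., a first-year federal loan limit

def affordable_on_day_one(cost_of_attendance, grants, family_contribution):
    """True if grants + reasonable work earnings + family contribution
    + a reasonable loan fully cover the cost of attendance."""
    work_earnings = REASONABLE_WEEKLY_HOURS * HOURLY_WAGE * ACADEMIC_YEAR_WEEKS
    resources = grants + work_earnings + family_contribution + REASONABLE_LOAN
    return resources >= cost_of_attendance
```

The parameters are the moving benchmark: as costs or resources change, the caps must be revisited, and a state-level analysis would run this comparison across income groups rather than for one student.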
In the Minnesota Office of
Higher Education’s view, state
affordability is achieved when:
A typical family can afford 50%+ of
educational options available to them, and
A typical family can afford 50%+ of
local educational options available to them (colleges near where they live).
Over the student’s lifetime, the return on investment for students and families can best
be measured by comparing their net earnings after college that can be
attributed to their education, to the net cost of their education. For
students, ideally, the net earnings over the first 10 years post-college would
exceed the net cost of college — a positive return on investment.
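That comparison can be written as a one-line formula. The dollar figures below are invented examples, not Minnesota data.

```python
# Lifetime return-on-investment sketch: the earnings premium
# attributable to the degree over a 10-year window versus the net
# cost of college. All figures are invented examples.

def net_roi(annual_grad_earnings, annual_baseline_earnings, net_cost, years=10):
    """Positive => the earnings premium exceeds net cost over the window."""
    premium = (annual_grad_earnings - annual_baseline_earnings) * years
    return premium - net_cost

# A graduate earning $52,000 against a $38,000 baseline, with a
# $60,000 net cost, clears a $140,000 ten-year premium: a positive return.
```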
There are areas of study where there is less likely to be a positive return on
investment, such as for early childhood educators, legal aid attorneys, and
culinary workers. In this case, state policymakers might want to weigh the demand
for occupations such as these and consider additional subsidies or alternative
training modules to make them more affordable.
During loan repayment, the affordability of college should be measured by the
cumulative debt burden a student carries after college: the percentage of
income required to fully pay off their debt in 5-10 years. If the percentage of
income is too high, the borrower is at
risk of default, in which case, college cannot be considered affordable for
that student. However, the state’s impact on student borrowing is less direct.
Students and families decide when to borrow and how much to borrow within
federal guidelines and private borrowing options available to them.
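The repayment-burden measure can be sketched with a standard amortization formula. The 5% interest rate and the example figures are assumptions for illustration, not policy thresholds.

```python
# Repayment burden: share of income needed to retire cumulative debt
# within a 5-10 year window (rate and figures are assumed examples).

def annual_payment(debt, rate=0.05, years=10):
    """Standard fixed-rate amortization payment."""
    if rate == 0:
        return debt / years
    return debt * rate / (1 - (1 + rate) ** -years)

def repayment_burden(debt, annual_income, rate=0.05, years=10):
    """Fraction of annual income consumed by loan payments."""
    return annual_payment(debt, rate, years) / annual_income

# e.g., $27,000 in debt on a $40,000 salary takes roughly 8.7% of
# income over a 10-year term at 5% interest.
```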
To have the most
significant impact, state policy decisions must focus on the first measure, affordability
at entry. If this is achieved, a positive return on investment is much more
likely, as well as a manageable debt burden. The challenge to states in
ensuring that students have adequate resources on day one is to understand
which students currently do not.
The information gathered from a detailed state-level analysis of affordability
metrics will be humbling, but it should also inspire us to meet the very real
and growing challenge of college affordability. Organizations such as SHEEO,
and its member states and postsecondary systems, work to meet this challenge
and, collectively, we are making a real difference.
In 2014, President Obama announced the launch of the Free Application for Federal Student Aid (FAFSA) Completion Initiative to help states, districts, and schools give students the support they need to complete the FAFSA form. Often the gateway to accessing financial aid for college, career school, or graduate school, the FAFSA gives students access to the nearly $150 billion in grants, loans, and work-study funds that the federal government has available. And in many cases, students are required to submit the FAFSA before they are considered for any financial aid.
In 2016, Ohio started participating in the FAFSA Completion Initiative, and today it is
the first step of the state’s 3 to Get Ready campaign, coordinated by the Ohio
Department of Higher Education (ODHE) to help high school students prepare for
college by focusing on applying for financial aid, submitting college
applications, and selecting where they’ll pursue their postsecondary education.
ODHE shares with school districts specific, limited information about the
students who have completed the
FAFSA form. The data enables
school and district partners to identify those students who have not filed a
FAFSA form and better target counseling, filing help, and other resources for those
students. These efforts can promote college access and success by ensuring that
students, particularly those with a low family income, have access to financial
aid to fund their education.
ODHE, in partnership with the Management Council of the Ohio
Education Computer Network (MC OECN), provides data through a secure web portal.
The MC OECN K-12 Portal hosts many district initiatives, including assessment
information and teacher evaluation data, so districts are familiar with the
platform.
One of the challenges in Ohio is that districts are legally prohibited from reporting
students’ names, parents’ or other family members’ names, and addresses or Social
Security numbers to the Ohio Department of Education. This makes matching the
FAFSA data to the high school a difficult task that requires using specific data
elements from the application. Using the Institutional Student Information
Record (ISIR) data that the state receives, ODHE matches the National Center
for Education Statistics (NCES) high school code through a crosswalk to the
high school and district. Next, the data are filtered by date of birth, and
the system only forwards applications for students who are 22 years old or
younger. Updates of FAFSA completions are sent using a secure web service each
Wednesday, so the list is refreshed weekly from mid-October through June.
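A simplified sketch of that weekly match-and-filter step follows. The record layout, the crosswalk entries, and the age computation are illustrative assumptions, not the actual ISIR file format or ODHE's implementation.

```python
from datetime import date

# Assumed NCES-code-to-district crosswalk (illustrative entry only).
NCES_TO_DISTRICT = {"390001": "Sample City Schools"}
MAX_AGE = 22

def age_on(dob, as_of):
    """Whole years of age on a given date."""
    return as_of.year - dob.year - ((as_of.month, as_of.day) < (dob.month, dob.day))

def route_applications(isir_records, as_of):
    """Keep applicants aged 22 or younger; route each record to its
    district via the NCES high school code, or to the statewide
    'unmatched' list when the code is missing or unrecognized."""
    by_district, unmatched = {}, []
    for rec in isir_records:
        if age_on(rec["dob"], as_of) > MAX_AGE:
            continue
        district = NCES_TO_DISTRICT.get(rec.get("nces_code"))
        if district:
            by_district.setdefault(district, []).append(rec)
        else:
            unmatched.append(rec)
    return by_district, unmatched
```

Running this each Wednesday over the latest ISIR file would regenerate the per-district lists and the statewide unmatched list that districts search by name and date of birth.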
District superintendents must electronically sign a data release and security agreement to gain access to the FAFSA student-level information (screenshot below). There are 462 data-sharing agreements on file, out of more than 700 local education agencies in the state, so more than half of the districts are participating.
Once the data-sharing agreement is signed, the districts use
a simple web interface with student name, date of birth, and high school to
review the students’ FAFSA completion in their district. If the NCES high
school code on the FAFSA application is missing, the student goes into a
statewide list of “unmatched” schools so the districts can search by name and
date of birth. The data are available for download to Excel for ease of use.
To date, the percentage of FAFSA completions has not changed
much, but the number of districts participating in the initiative keeps
growing. As part of Ohio’s attainment
goal that 65% of Ohioans, ages 25-64, will have a degree, certificate, or
other postsecondary workforce credential of value in the workplace by 2025, the
FAFSA Completion Initiative is one component of Ohio’s plan to reach that goal.
Institutions of higher education
continue to enhance their capacity to develop and evaluate strategies for
promoting college access and success. Recent studies from Texas and the University of Michigan are prominent examples
of how institutional data can be leveraged to better understand how financial
aid policies improve college attainment for low-income students. However, financial
aid data use is uneven across institutional contexts. A major barrier is the
lack of coherent federal policy governing how information generated from the financial
aid process can be used to understand how aid and economic factors impact
college completion and student success outcomes. This blog post provides an
overview of these issues and considers the state’s role in developing effective
models for financial aid data use.
As demonstrated in SHEEO’s most recent Strong Foundations survey (and briefly summarized in the table below), coverage of financial aid metrics in state data systems is lower and varies more across states than other types of student data. Fewer states and systems report tracking their students’ debt loads as compared to workforce data on postgraduate earnings, even though earnings data are much newer and are typically sourced externally from state workforce agencies. National survey data from institutional research offices paint a similar picture. A majority of these units reported having limited or no access to financial aid data and limited involvement in studies involving student financial aid modeling or student debt.
Coverage of Student Data Elements Collected in State Data Systems (Public Four-year Institutions)
These findings are
incongruent with the well-deserved focus on college affordability in state and
federal policy discussions today. Many states and colleges are developing
programs and strategies to address financial barriers in order to promote better
and more equitable student outcomes. Evaluating information about students’
ability to pay for college and the effectiveness of aid programs in addressing
these challenges should be part of that solution. However, prevailing interpretations
of federal law suggest that institutions’ limited use of financial aid data is
actually by design. Uncertainty about allowable use cases involving
identifiable financial aid data limits their integration with other student
data streams and their use in evidence-based decision-making.
Federal Financial Aid Data Use Policy
While the data use
framework established under the Family Educational Rights and Privacy Act (FERPA)
applies to financial aid data, other laws further restrict the use of data
collected as part of students’ participation in federal financial aid. Most
importantly, the last reauthorization of
the Higher Education Act (HEA) in 2008 added a provision that data collected on the
FAFSA could only be used “for the application, award, and administration of aid”
with the goal of protecting student privacy. Three years later, the U.S.
Department of Education (ED) issued regulations establishing explicit research
and evaluation exemptions in FERPA, but the HEA use restrictions were not
considered at that time. While the HEA signals elevated privacy protections for FAFSA data,
whether and how the HEA restrictions should apply to institutional research activities
and program evaluation remains unresolved over a decade later.
After increasing calls
from the field for clarity, in 2017, ED’s Privacy Technical Assistance Center
(PTAC) issued FAQ guidance. ED’s guidance
promotes an expansive regulatory interpretation of the HEA provision in two critical
ways that, intentionally or not, effectively close off most uses of identified institutional
financial aid data. First, award information derived from the FAFSA is also
considered “FAFSA data.” This makes virtually all student-level financial aid
data elements subject to the HEA use restrictions, including measures like Pell
eligibility, a common proxy for identifying lower-income students, as well as
state and institutional aid awards information that is derived from the FAFSA. Second,
institutions may only conduct evaluations and analyses that are “necessary to
the efficient and effective administration of student aid.” While the
boundaries of such activities are not explicitly addressed by the PTAC guidance,
they have been further interpreted by
the National Association of Student Financial Aid Administrators (NASFAA) as potentially excluding
activities that use student financial aid data to gauge the effectiveness of
financial aid programs aimed at increasing completion rates and addressing
equity gaps.
This has created
challenges for institutions in their routine uses of financial aid information as
well as for those seeking to use the data more intentionally to serve students.
Specific examples of the negative impact of the non-regulatory guidance on campus
and system operations were outlined in a letter to ED
last fall. The guidance has also thrown into question the inclusion of
financial aid data in statewide and systemwide
data systems, even though previous PTAC guidance
affirmed such sharing and uses to support data-driven
decision-making. Even processes necessary for compliance with state and federal
mandates are being disrupted, such as reporting to the federal Integrated
Postsecondary Data System (IPEDS) and federal grant reporting. These dynamics
may also have a chilling effect on data
sharing for educational research involving university-based and external faculty, like the examples from
Texas and Michigan shared above.
Despite the potentially
far-reaching regulatory impact of this guidance, it was issued informally and not
widely publicized. This is something the Government Accountability Office (GAO)
has cautioned federal agencies about. These issues are complex and require more
consultation with the field to ensure that data uses in support of educational
programs are not adversely impacted.
For those who work in and around higher education research and policy who are either a) totally unaware of these issues or b) confronting barriers in this area seemingly out of the blue, you are in good company. But if the goals of effective data use policy are to protect student privacy and to balance those interests with promoting legitimate research activities for institutional improvement, the status quo does neither particularly well.
Toward a Student-Centered Approach to Financial Aid Data Use: A Role for States
While well-intentioned, the current state of financial aid data use policy is problematic for students and the institutions and states that serve them. Safeguarding the privacy and confidentiality of student and parent data in the financial aid process is an important policy goal and should not be diminished, especially given the compulsory nature and the sensitivity of financial data collected by the FAFSA. But restricting states and institutions from most uses of financial aid data neglects the key role that states play in financing higher education, as well as the realities that many states, colleges, and students face in financing college costs. It also undermines goals of institutional accountability, transparency, and the accuracy of external reporting, including to the federal government through IPEDS. The federal government now collects outcomes data on programs like the Pell Grant and Stafford Loan Programs, and its interest in these data is tied to Title IV, but high-aid states like California are investing even more in lower-income students. States and institutions have a compelling interest in these data and are in a better position to connect the dots. This reality should be reflected in federal law and policy as part of a more student-centered approach to financial aid data use.
In conjunction with
other stakeholders, state systems and SHEEOs are well positioned to negotiate
these trade-offs and provide leadership in building a more student-centered
vision for financial aid data use. As stewards of longitudinal and linked data
systems and the primary organizational connection between public colleges and
state and federal policy, they are a logical place to operationalize models for
responsible data use to address some of the pressing challenges related to
college access and affordability in their states.
Case is Assistant Director for Policy Analytics with the California State
University, Office of the Chancellor. The opinions expressed in this blog are
those of the author and do not necessarily reflect the official policy or
position of the California State University, the State Higher Education
Executive Officers Association, or any other organization.
The Organisation for Economic Co-operation and Development (OECD) is currently conducting a multinational study with the primary goals of (1) assessing the performance of higher education systems in developing graduates with relevant, durable, and transferable skills that are understood and trusted by employers, and (2) identifying policy choices for governments that can improve the capacity of their higher education systems to anticipate, develop, and clearly signal market relevant skills. The U.S. segment of the study is supported by funding from the Lumina Foundation. Washington was selected as one of the states representing the U.S., along with Virginia, Ohio, and Texas. The Washington Student Achievement Council (WSAC), as the state partner, has been collaborating closely with the OECD team on research activities in the state.
About the OECD and WSAC
The OECD, headquartered in Paris, France, is an international
organization of 36 member countries in Europe, East Asia, and the Americas, including
the United States. Its mission is to promote policies to improve the economic
and social well-being of people in its member countries. The OECD does this by
providing a forum in which senior officials from member countries meet to
explore shared concerns, discuss good practices, and seek solutions to common
problems. The OECD Directorate for Education and Skills provides evidence and
policy advice to member governments that draw upon its international network of
education statistics, large-scale surveys, and assessments it has developed in
collaboration with national authorities, as well as in-depth reviews of
national government policies and educational practices.
WSAC is a nine-member council appointed by the governor of Washington and supported by a cabinet-level state agency. The council provides strategic planning, oversight, advocacy, program administration, and policy research and analysis to promote increased student success and higher levels of educational attainment in Washington.
Aims of the Study
Across the OECD, higher education institutions aim to
cultivate in their graduates the skills and knowledge they need to succeed in
the labor market, and many higher education graduates do achieve rewarding and
well-paid careers. However, worrying numbers of graduates have difficulty
obtaining jobs that correspond with their academic training and qualifications,
and others are employed performing tasks that do not draw upon graduate-level
skills. Where graduates borrow to finance their studies, some struggle to
service the debt they have taken on. In some higher education systems,
graduates frequently decide they must resume education and training to seek
qualifications better suited to labor market demands than those they initially
obtained. In virtually all higher education systems, educators and
administrators are struggling to anticipate and understand the future of work,
and to grasp what this implies for what and how they teach and the skills their
graduates should possess.
Employers, for their part, often report that it is difficult
to find suitable numbers of graduates prepared to enter high-demand professions
and emerging occupations, or graduates with socio-emotional skills that
complement their specialist knowledge.
The OECD’s multiyear project on the labor market relevance
and outcomes of higher education aims to help governments to:
Assess the performance of higher education systems in developing graduates with relevant, durable, and transferable skills that are understood and trusted by labor market actors;
Diagnose the difficulties higher education systems face in developing and signaling skills; and
Identify policy choices for governments that can improve the capacity of their higher education systems to anticipate, develop, and signal market relevant skills.
The Research Approach
Phase 1: Review of state data trends, policy and
planning documents, research reports, and policy analyses. From December
2018 to April 2019, the WSAC team collaborated closely with OECD researchers to:
Develop a broad overview of Washington’s state
higher education governance structure and the relationships among various
relevant state agencies and community, business, industry, and non-profit organizations.
Understand the full range of policy and planning
efforts by Washington state agencies to align the higher education system with
state workforce needs and to connect students and working adults with relevant
career education and training.
Identify key research reports and policy
analyses to illuminate data trends related to the labor market outcomes of
Washington’s higher education system.
Phase 2: Stakeholder
interviews and policy workshop. From April 29 to May 7, 2019, the OECD team
visited the state of Washington to conduct a series of interviews and a policy
workshop with key stakeholders to gain a range of different perspectives on
issues highlighted in reports. The WSAC research team identified key stakeholders
and coordinated the scheduling of interviews in different regions of the state.
Interviews included legislators, state-level public officials, higher
education leaders and staff, employers, industry and labor, and nonprofit
organizations. Discussions during the interviews were structured but generally
free-flowing and flexible, allowing participants to focus on topics they thought
were key to the issue or ones that had not received as much attention as they
deserved. Discussion questions ranged from broad-based to more specific,
depending on what sector the participants were representing. For example, all
participants were asked to consider the broad-based questions:
From your perspective, what is the experience of recent
higher education graduates when they enter and begin to progress in the labor
market? What are the main factors driving these labor market outcomes? Participants
were also asked to consider questions focused on issues relevant to the sectors
they represent. For example:
Legislators and state-level agency officials
were asked to consider whether there are processes that support or hinder the
ability of state higher education policymakers to take into account labor
market relevance in their work.
Higher education institutional leaders were
asked to consider what practices their campuses have developed or planned to
foster labor market relevant skills.
The Policy Workshop was held on
May 7, 2019. The WSAC team identified participants for the workshop and
coordinated the meeting arrangements. The workshop discussions were facilitated
by the OECD team. A diverse group of stakeholders were invited, including state-level
policymakers and agencies, higher education leaders, and representatives from
business, labor, regional workforce and economic development, and non-profit
organizations. The workshop was organized around small group discussions among
stakeholders with report-outs. Key topics included identifying:
Key issues in Washington affecting student labor market outcomes and the alignment of higher
education with evolving workforce demands and opportunities.
Policy options for performance
improvement, focusing on
concrete actions that can be prioritized on the basis of their potential impact.
The different roles that
stakeholder groups play in developing and implementing these actions, including
how state-level policymakers can support, facilitate, or coordinate these efforts.
Phase 3: Preparation of Report. The Washington state report will be incorporated into a U.S. report,
including findings and recommendations for the four states participating in the
study. The U.S. report is planned to be completed in the spring of 2020.
U.S. findings will
then be used to inform a subsequent international report, which will draw comparative
analyses and policy advice across OECD member countries participating in the
labor market relevance and outcomes project.
OECD Washington Review Team
Patricia is an analyst at the OECD and the project lead for the review of labor market
relevance and outcomes of higher education in the United States. A native of
France, Patricia has worked for many years in higher education and employment
policy in Canada and earned an MA/MSc degree in political science and international
relations from the University of Toronto.
Monica is an analyst at the OECD, with a background in research and analysis across
multiple policy areas. A native of Norway, Monica lived in the United States
for many years and took her MPP degree at Georgetown University in Washington, D.C.
Thomas Weko supports the project as the head
of the higher education policy team at the OECD. Previously, Thomas was the associate
commissioner at the National Center for Education Statistics. Thomas is a U.S.
national and earned a Ph.D. in political science and government from the
University of Minnesota.
Kwakye is director of research at the Washington Student Achievement
Council. He has conducted research in many different policy areas, including
education, welfare/income assistance, and health initiatives in Canada and the
United States. He has a master’s degree in applied economics, an MBA, and a
Bachelor of Science degree in mathematics.
Daryl Monear is associate
director of research at the Washington Student Achievement Council, focusing on
education policy and workforce alignment issues. He has a Ph.D. in educational leadership
and policy studies from the University of Washington.
SHEEO’s Communities of Practice project is focused on increasing the capacity and utilization of state postsecondary data systems and providing a forum for states to work on solutions to common issues with those systems. Through the Communities of Practice project, SHEEO is developing an ongoing network of state postsecondary data users by providing opportunities for members to share information, analyze solutions, and provide assistance to practitioners in other states. Over the past two years, SHEEO has hosted four Communities of Practice Convenings with 35 states represented.
This blog post outlines several data sources researchers and
policy analysts can use to study student loan debt. It covers many of the
standard sources to introduce readers who may be unfamiliar with the data
landscape. But it also includes some less well-known sources that even the more
experienced researcher might find useful to explore.
For example, we just learned the Federal Reserve’s Consumer Credit
Explorer reports state-level student loan debt based on credit panel data.
This resource shows what proportion of credit holders carry student loan debt,
their median debt levels, and the percent who are severely delinquent. Data can
be disaggregated by metropolitan statistical area, age group, credit score
group, and neighborhood income group.
It allows us to see, for example, that the median student
loan debt for credit holders in Wisconsin is $17,323 and has stayed
relatively flat for the past several years.
As with any data source, there are caveats, trade-offs, and
limitations that need to be considered and explained. For example, the credit
panel data do not link to enrollment records, so we do not see how debt and
delinquency vary for college completers versus non-completers – something
policymakers are increasingly concerned about. However, these data are
generated from credit reports and made available each quarter, giving a timely
and high-quality glimpse into the aggregate stock of student loan debt. In
addition to this resource, the Federal Reserve’s quarterly Report on
Household Debt and Credit includes additional aggregates (some at the state
level) for outstanding student loan debt.
With this example in mind, the rest of this post walks
through various student loan data sources that either take a snapshot at a
single point in time or that follow individuals over several years. It also
includes aggregate student loan data at the college level that can be linked to
various other data sources. In all cases, our hope is these resources can help
researchers and policy analysts pinpoint problems facing the student loan
system while imagining better ways to collect and distribute student loan data.
With any luck, using these sources can help contribute to ongoing efforts to
understand and improve student loans for current and future borrowers.
National Postsecondary Student Aid Study (NPSAS). Beginning
in the 1986-87 academic year, the NPSAS survey has
collected extensive data on college students’ financial aid packages,
demographics, academics, and educational expectations. The earlier surveys were
conducted every three years: 1987, 1990, 1993, and 1996. It then went to a
four-year survey cycle for the years 2000, 2004, 2008, 2012, and 2016. Beginning
in 2018, NPSAS collects administrative financial aid records (but not surveys) on a
two-year cycle to provide more recent national and, in some cases,
state-representative estimates.
Unlike most other surveys outlined here, NPSAS is linked to administrative
records from the National Student
Loan Data System (NSLDS), resulting in accurate and up-to-date estimates of
student loan debt. The NSLDS contains
transactional data on federal Title IV loan programs including borrowers’
original principal balance, current repayment plan and loan status, along with
outstanding principal and interest. Borrowers and their loan servicers can
access NSLDS to monitor and manage their loans. Colleges and state agencies can
access it to monitor and report federal loan data (e.g., calculating cohort default rates, or CDRs).
NSLDS is not currently designed to be a research database, though it is
possible to link to some records through research partnerships or via U.S.
Department of Education surveys like NPSAS. With NPSAS, a researcher can
examine student loan borrowing patterns during college. What proportion of
students borrow? How much do they borrow? How do borrowing patterns vary by
sector or students’ race/ethnicity, family income, or choice of major?
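With a student-level extract in hand, questions like these reduce to simple group-wise summaries. The sketch below is purely illustrative: the column names and rows are invented (real NPSAS analyses run through NCES PowerStats or restricted-use files, and must apply survey weights, which are omitted here).

```python
import pandas as pd

# Hypothetical student-level extract. Column names are invented for
# illustration; actual NPSAS variable names differ.
students = pd.DataFrame({
    "sector":   ["public 4-yr", "public 4-yr", "private 4-yr", "for-profit", "public 2-yr"],
    "borrowed": [True, False, True, True, False],
    "loan_amt": [5500, 0, 9500, 12000, 0],
})

# Share of students who borrowed, by sector (mean of a boolean column).
borrow_rate = students.groupby("sector")["borrowed"].mean()

# Median loan amount among borrowers only, by sector.
median_debt = students[students["borrowed"]].groupby("sector")["loan_amt"].median()
```

The same groupby pattern extends to race/ethnicity, family income, or major once those columns are present in the extract.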
Survey of Consumer Finances (SCF). The Federal
Reserve Board’s SCF
surveys families on their net worth, income, and other financial information
every three years. Beginning in 1989, the survey started asking questions about the amount
of debt students borrowed and still owe. In 2016, it added questions about
participation in income-driven repayment. This survey allows researchers to
link self-reported student loan debt with a wide range of socioeconomic,
financial, and demographic characteristics of borrowers. But when researchers
linked SCF data to credit panel records, they found
SCF respondents underestimated their loan balances by 25%, and this gap may be
growing over time. Nevertheless, the SCF is one of the very few data sources linking
debt to income and wealth over multiple time periods.
Survey of Household Economics and Decisionmaking (SHED).
Since 2013, the Federal Reserve Board’s SHED
has been used to produce the Report on the Economic Wellbeing of U.S.
Households. The survey focuses on a range
of topics including credit access, retirement plans, economic fragility, and
student loan debt. Its student loan questions focus on how much students
borrowed for their education and whether they are behind on student loan
payments. These data could be used to triangulate other self-reported sources
while investigating the unique role student loans play in respondents' overall finances.
Consumer Expenditure Survey (CE). The Bureau of Labor
Statistics’ CE survey uses
interviews and diaries to gather information about how respondents spend their
money. The survey includes a wide range of demographic, income, and background
details of the respondent and the weights from these expenditure items are used to calculate the Consumer
Price Index. The survey has a long history dating to the
early 1900s, but it wasn’t until 2013 that it began asking respondents about whether
they borrowed student loans
and, if so, how much they now owe. These data are now reported annually, and
researchers can examine how expenditures on loans correlate with other
household expenditure items such as mortgage payments, car payments, and other
detailed spending categories.
National Financial Capability Study (NFCS). The FINRA
Investor Education Foundation sponsors the NFCS, a national
survey measuring respondents’ perceptions and attitudes toward personal
finance. Their data
page includes state-level responses and their 2012 and 2015 surveys include
questions about student loan debt and repayment.
National Center for Education Statistics (NCES)
Longitudinal Studies of High Schoolers. The U.S. Department of Education
National Center for Education Statistics (NCES) has a series of longitudinal
studies following up with students from high school into college and the labor
force. The National Longitudinal Study of 1972 (NLS-72) surveyed 12th
graders in 1972 and followed up with them five times until 1986. The 1986 follow-up asked students
whether they received loans to pay for college and, if so, how much. Similar
questions were included in subsequent longitudinal studies. High School &
Beyond (HS&B) surveyed
10th and 12th graders in 1980 and followed up multiple
times until 1993. This survey was linked to federal loan records, which show
whether and how much federal loan debt respondents borrowed
since high school. The subsequent longitudinal studies are also linked to
federal loan records. The National Education Longitudinal Study of 1988 (NELS:88)
surveyed 8th graders in 1988 and followed up with them several times
until 2000, eight years after high school. The next NCES survey, Education
Longitudinal Study of 2002 (ELS:02),
followed 10th graders in 2002 until 2012, eight years after high
school. The most recent NCES longitudinal survey, the High School Longitudinal
Study of 2009 (HSLS:09),
surveyed 9th graders in 2009 and has now followed up with them until
2016, three years after high school. Across these surveys, it is possible to
measure the factors associated with borrowing, how much, and whether these
patterns have changed since the 1970s. To my knowledge, this comprehensive
analysis has not been done and would be a ripe area for research to make full
use of existing student loan data.
National Center for Education Statistics (NCES)
Longitudinal Studies of College Students. In addition to these surveys of
high school students into young adulthood, NCES also has two postsecondary
surveys following up with college students. The cross-sectional NPSAS survey
spawns two longitudinal studies: Beginning Postsecondary Students (BPS) and Baccalaureate
and Beyond (B&B).
Both surveys are linked to NSLDS loan records. BPS follows up with first-year students throughout their
college years and immediately into the labor force, while B&B follows graduating seniors after college.
The current BPS follows up with the
entering class of 2012 in 2014 and 2017. Prior BPS surveys followed up with the
1996 and 2004 cohorts of beginners for six years, and the 1990 survey followed
up with students four years later. Many student loan problems
are concentrated among borrowers who begin, but do not complete college.
Because of this, the BPS survey is well suited for following up with students
who leave college with debt and no degree. To measure long-term loan outcomes,
rather than just six years after entry, NCES created a federal aid
supplement showing loan repayment outcomes 20 and 12 years later for the
1996 and 2004 entering classes, respectively.
The current B&B survey follows
up with the 2008 cohort of graduating seniors in 2012 and 2018. The first
B&B followed up with the cohort of 1993 graduating seniors in 1994, 1997,
and 2003. The second B&B did not follow graduating seniors for ten years,
as other B&B surveys do; instead, it only followed up with the 2000 cohort
of graduating seniors to 2001. The next B&B survey will follow up with the
graduating class of 2016 in 2017, 2020, and 2026. Importantly, the B&B
study selects only those students who made it to graduation, so any analysis of
student loan debt should bear in mind that students who graduate with debt may
be quite different from those who do not. Taken together, BPS and B&B are useful
for comparing the experiences of completers and non-completers.
Panel Study of Income Dynamics (PSID). Beginning in
1968, the PSID
collected data from over 18,000 individuals living in 5,000 families. As those
families grew, the PSID began collecting data not only on the original
respondents but also their descendants' families. The survey now includes over 10,000
families and 24,000 individuals and is the world’s longest-running
nationally representative household panel survey. Starting in 2005 and every
odd year after that, it asks whether and how much respondents borrowed for
college in its Transition into Adulthood Supplement (TAS) survey. TAS includes
young adults between 18 and 28 years old and researchers can explore factors
associated with borrowing before college entry, during college, and in
respondents’ early years after leaving college.
National Longitudinal Surveys of Youth (NLSY). The
Bureau of Labor Statistics has two surveys following up with young adults over
the early life-course: the National Longitudinal Survey of Youth: 1979 (NLSY79) and the National
Longitudinal Survey of Youth: 1997 (NLSY97).
The original NLSY79 cohort included nearly 13,000 respondents between the ages of 14 and 22
in 1979; these individuals were between the ages of 51 and 60 in the 2016
follow-up survey. The original
NLSY97 cohort included nearly 9,000 individuals between the ages of 12 and 18
in 1997; these individuals were between the ages of 30 and 36 in the 2016
follow-up survey. In both surveys, respondents are asked to self-report whether
and how much money they borrowed for college.
Longitudinal Study of American Youth (LSAY). In 1986,
the LSAY sampled a cohort
of approximately 5,000 7th and 10th grade students and
followed up with them for seven years. Then in 2007, LSAY resumed its annual
surveys of this same cohort, following up with them until 2011. This survey
asks respondents whether they ever borrowed money to pay for college and, if
so, how much they currently owe (when they are in their late 30s). The survey
also asks respondents to distinguish between undergraduate and graduate school
debt, as well as other questions like whether their student loans interfered
with their efforts to buy a home.
Integrated Postsecondary Education Data System (IPEDS).
The 1998 amendments to the Higher Education Act (HEA) mandated NCES to report the
number of full-time, first-time degree/certificate-seeking undergraduates
awarded student loans for all colleges and universities participating in
federal Title IV programs. Beginning in 2000, these data have been reported
annually. The 2008 HEA reauthorization expanded annual reporting (beginning in
2009) to include the number, percent, and average amount of all federal loans
awarded to undergraduates, but not graduate students.
Federal Student Aid (FSA) Data Center. The U.S.
Department of Education’s FSA office
provides reports for every college participating in federal Title IV aid
programs. Their annual loan volume reports begin in the 1999-00 award year and,
beginning in 2006-07, these reports are now released every quarter. These files
provide the number and dollar amount of Direct Loan originations and
disbursements for subsidized, unsubsidized (both for undergraduate and graduate
students), Parent PLUS and Grad PLUS loans. The FSA data center also provides
Default Rates (CDRs) by school and program-level Gainful Employment
debt-to-earnings rates, which measures the median annual loan payment amount
for program completers. Finally, default rate data for the Perkins Loan program
are archived here.
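Because the loan volume reports are spreadsheet workbooks, a few lines of pandas can roll them up. The frame below mimics the shape of those files with invented school names, loan-type labels, and column headers; a real workflow would start from `pd.read_excel` on a downloaded report, whose actual headers differ.

```python
import pandas as pd

# In practice the data would come from pd.read_excel() on a downloaded
# FSA loan volume workbook. The rows and column names below are invented
# to show the rollup, not FSA's actual headers.
volume = pd.DataFrame({
    "school":           ["College A", "College A", "College B", "College B"],
    "loan_type":        ["Subsidized", "Unsubsidized", "Subsidized", "Grad PLUS"],
    "originations_usd": [1_200_000, 2_500_000, 800_000, 650_000],
})

# Total Direct Loan dollars originated at each school, across loan types.
total_by_school = volume.groupby("school")["originations_usd"].sum()

# Dollars originated by loan type, across all schools.
total_by_type = volume.groupby("loan_type")["originations_usd"].sum()
```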
College Scorecard. The U.S. Department of Education’s
College Scorecard includes
a wide array of student loan data unavailable in IPEDS or the FSA data center. It
reports federal loan repayment rates for cohorts one, three, five, and seven
years after entering repayment. These rates are disaggregated by the borrower’s
family income, Pell status, dependency status, gender, first-generation status,
and whether or not they completed a degree. The Scorecard also reports median
debt for these same groups (e.g., family income, Pell, completers, etc.) along
with the median debt of completers reported as monthly payments (on a 10-year
amortization plan). In addition to medians, it reports cumulative debt at the
10th, 25th, 75th, and 90th
percentiles, but these are not disaggregated by student characteristics. The
College Scorecard includes CDRs (from FSA) and percent of undergraduate
borrowers (from IPEDS). Institution-level data are available annually from
1996-97 to 2017-18. In May 2019, the College Scorecard released preliminary data
on average and median debt for completers by their field of study and degree level.
Common Data Set. The Common Data Set is a voluntary survey
sponsored by Peterson’s and available to purchase here. Section H
includes data on the number, percent, and average cumulative principal balance among
graduating seniors who started college at the same institution. The Institute
for College Access & Success' College InSight is a
useful tool for exploring
these figures at the campus level along with other fields from the CDS.
Brookings Institution. The Brookings Institution hosts
a page titled Measuring
Loan Outcomes at Postsecondary Institutions, which contains several Excel
files of aggregate loan data at the campus level for undergraduate, graduate,
and PLUS borrowers. The data includes mean, median, and 1st-10th
decile debt balances of students entering repayment in FY2009. It also
tabulates the total remaining balance for the cohort by institution for each of
the following five years. This page also provides cohort repayment and default
rates, as well as institutions’ total balance of debt in default as of
FY2010-FY2014 and rates of deferment or forbearance participation. These data
were produced by FSA, but are unavailable on the FSA website.
Center for American Progress. Researchers at the Center
for American Progress obtained previously
unreleased data on default, delinquency, and pay-off rates for more than
4,700 colleges. The data follow the 2012 repayment cohort for five years,
measuring the annual number of borrowers whose loans are paid off, current,
delinquent, in deferment, or in default. It also includes data on the annual
total loan balance for each of these outcomes, offering a proof
of concept for the kind of loan repayment data that could be made available
for other cohorts.
This is our first attempt to pull all these resources together in a single place. We have not used all of these sources in our own work, so our familiarity and expertise are limited. Our job here was to curate a list that we believe can be useful for researchers who are hungry for student loan data but are often constrained by data availability. We are of the mindset that the research community should make full use of these sources and, in so doing, can contribute to productive conversations and actions related to improving data quality and public policy in the service of current and future student loan borrowers. The following table summarizes and provides links to sources outlined in this post. All errors or omissions are ours alone and we appreciate any constructive feedback and insights you'd like to offer if you've read this far!
Since 2010, the State Higher Education Executive Officers Association (SHEEO) has
periodically administered the Strong Foundations survey, which documents the
content, structure, and effective use of state postsecondary student unit
record systems (PSURSs). This report highlights the results of the fourth
administration of the survey, conducted in 2018.
PSURSs exist in an increasingly complex postsecondary data environment, one in which
the interplay between state, federal, and institutional data collections and
policy contexts continues to evolve. Over the past decade, PSURSs have been
greatly influenced by increased linkages between different sources of
administrative data and by the proliferation of state educational attainment
goals. The evolving context notwithstanding, PSURSs remain vital information
resources necessary for states to analyze, understand, and improve their
systems of postsecondary education.
The report includes a detailed discussion of the pervasiveness of PSURSs’ use of benchmark privacy and security practices. These include privacy and security processes (i.e., data governance and physical security), standards (FERPA, state statute, etc.), and practices (destroying data, employee training).