Cost-Effectiveness in Delivering a High Return on Investment
Susan Coia Galley
Institutional Research, Assessment and Report Consulting
Cost-effective and return on investment (ROI) are terms with origins in the business sector. The dictionary defines cost-effective as "producing optimal results for the expenditure." Complementing this definition: "Return on investment (ROI) is the concept of an investment of some resource yielding a benefit to the investor. A high ROI means the investment gains compare favorably to investment cost. As a performance measure, ROI is used to evaluate the efficiency of an investment or to compare the efficiency of a number of different investments. In purely economic terms, it is one way of considering profits in relation to capital invested." ROI is increasingly cited in reference to college education. In this era, institutions of higher education must thrive to survive – develop or die. Leaders must focus on cost-effective means of delivering a high ROI to various constituents, or stakeholder groups – students/families, government, and taxpayers – and, philosophically, to society at large, in the interest of maintaining the United States' position as an intellectual and economic power.
The Government’s Catalyst For Intervention: Cost of Attendance
Though originating in the business sector, ROI has become a buzz term in higher education. Sparking assertions of its application to higher education, and heating the debate over its manifestations – concrete to elusive and amorphous – is the escalating cost of education at institutions of higher education, which is borne by students/families and the government/taxpayer.
The United States Department of Education features College Navigator, a reviewer-friendly tool that summarizes figures of general interest, such as institution-specific cost of attendance, average financial aid by income level, and student loan default rates (http://nces.ed.gov/collegenavigator/?q=Harvard+University&s=all&id=166027). Somewhat less handy, yet available in The Data Center, are average loan amounts by institution, disaggregated by loan type (e.g., federal versus the less desirable private) (http://nces.ed.gov/ipeds/datacenter/).
Newspapers occasionally publish articles containing lists of institutions rank ordered by cost of attendance (sticker price). Frequently cited over the years are statistics showing how much the cost of education has outpaced inflation and salaries. Cost is pitted against the starting salaries of recent graduates and against student/parent loans. Despite federal and state aid to students and to institutions, students/families are bearing more and more of the expense.
The government is turning up the pressure on institutions to hold them accountable for their use of government money and their charges to students. Pell recipient status is often used as a marker of the nation's most economically disadvantaged population; institutions soon might be required to report the graduation rate of Pell recipients for accountability, given that institutions receive federal dollars toward the education of these students [usually packaged with loans and an institutional discount].
Historically, the government has been reluctant to police disciplines/fields and institutions, preferring that they show evidence of self-monitoring. Once involved, the government does not shake loose. The government has pitted dollars spent/committed by taxpayers and students/families against their return – hence the term ROI. The government wants results, not explanations, and President Obama himself is its spokesman (Office of the Press Secretary. 2013, August 22. FACT SHEET on the President's Plan to Make College More Affordable: A Better Bargain for the Middle Class. The White House. Retrieved from http://www.whitehouse.gov/the-press-office/2013/08/22/fact-sheet-president-s-plan-make-college-more-affordable-better-bargain-).
Ivory Tower Reaction to the Scoreboard
Groups avidly address the use of Obama's College Rating System for colleges that educate high-risk students – a group defined by pre-admission characteristics that predict low graduation rates (Goldie Blumenstyk, The Chronicle of Higher Education, May 20, 2014, http://chronicle.com/article/Risk-Adjusted-Metrics/146193/?cid=at&utm_source=at&utm_medium=en). Citing pros and cons of taking risk into consideration when rating colleges, they passionately take their stands. Arguing against the use of a "one size fits all" set of metrics are professional groups such as the Association of Public and Land-Grant Universities and the National Association of Student Financial Aid Administrators. Invoking the Pygmalion Effect, some caution against the danger of setting low expectations (Center for American Progress), while others outright oppose adjusting expectations as establishing a double standard (Institute for College Access & Success). We academics will be quick to identify a plethora of stipulations and considerations attaching to any set of metrics. If one endorses adjusting for pre-admission characteristics that affect college outcomes, certainly "the devil is in the details." Nevertheless, the federal government evidences a propensity toward discrete measures, an aversion to ambiguity, and an allegiance to "informed consent" in decision-making by prospective students and their families. Perhaps the central issue in the debate is "ability to benefit" versus "opportunity," or "access." Do we derail some students by offering college instead of trade school for a potentially lucrative and satisfying occupation? We may need to assess applicants' "ability to benefit" [from a college education], assess the conditions essential to exercising that ability, and apply our results accordingly.
Fodder For ROI: Convincing Figures Are Mounting and More Visible
In the not-so-distant past, reporting on higher education issues was confined to the higher education literature. Reporting has been spilling over into mainstream media. The government, though slow to react, is becoming more active.
Providing fodder to the government's offense are technology-enabled statistics from national and state databases that are transformed into eye-opening, increasingly digestible information. In the meantime, scholarly, "pro-consumer" organizations and institutes capitalize on the availability of comprehensive databases and the technologies to merge them in order to deliver vivid depictions that are swiftly defining the aspects of ROI. For example, College Measures has spearheaded an Economic Success Metrics Program in partnership with the American Institutes for Research. Open to the public, the website features a reviewer-friendly search tool that shows a breakdown of graduates' salaries by institution and major field.
The Georgetown Public Policy Institute publishes reviewer-friendly tables of median salaries by major for people with bachelor's and graduate degrees in the major (http://cew.georgetown.edu/whatsitworth/).
Through its Employment Projections Program, the Bureau of Labor Statistics publishes occupational outlook data. It displays growing and declining professions and occupations, which fuels articles about the employment prospects and pay for specific jobs that can be tied to major fields of study (http://www.bls.gov/emp/).
The Accrediting Council for Independent Colleges and Schools publishes the results of a survey in which employers rate recent college graduates on the workforce skills that they expect graduates to possess (http://www.acics.org/events/content.aspx?id=4718).
The National Association of Colleges and Employers (NACE) conducts a survey that reports average starting salaries and salary ranges for graduates in 90 bachelor-level majors in categories that include business, engineering, healthcare, and tech-related disciplines – plus robust data for the liberal arts and other fields where salary information has historically been scarce. Reports feature breakdowns by: (a) major; (b) industry and major; and (c) industry, occupation, and major, so reviewers can determine the going rates. Additionally, there are at-a-glance trends and major-specific information from data reported by employers (http://www.naceweb.org/salary-survey-data/).
All the while, the information is falling into the hands of mainstream media with increased frequency. The media need only know where to look, and the government is already looking. Meanwhile, academic discussions and impassioned debates ensue within the walls of higher education regarding the various and appropriate aspects of ROI, how they should be measured, whether they can be measured, and whether they should be measured.
Are College and University Presidents Prepared with an Information Management System to Report ROI and Investigate Cost-Effective Solutions?
The government's College Rating System is in development and, likely, will continue to evolve after its debut. Are college and university presidents prepared for increasing demands? What is the current capacity of institutions to report, and to pursue cost-effective means of enhancing their own ratings to ensure institutional viability? Many, if not most, institutions are ill-equipped, as they lack a robust Information Management System (IMS). Struggling with existing reporting, they are even less prepared to research and assess in-house, cost-effective means of impacting ROI. Metrics/outcomes of interest to the government, and the proposed funding contingencies associated with them, will be prescribed without the wiggle room that regional accreditation has allowed for showing institutional effectiveness.
An IMS encompasses: (a) technical systems and procedures; (b) technical expertise in Information Technology departments (IT); (c) methodology acumen and technology user expertise in functional areas; (d) data dictionaries; (e) cross-functional cooperation in strategically prescribed data management; and, (f) data and information sharing and access practices to meet increasing research, assessment and reporting needs.
Derived from the “Data Warehouse,” a Data Library is a central repository of data files that are research, assessment, and report ready. Data files in a Data Library consist of data elements that are merged from various functional areas of the institution and deemed to have research, assessment and reporting value when captured at particular points in time (at census). These are institutional archives for research, assessment, and reporting.
Standard Operating Procedures (SOPs) are developed for data capture, at which time all data must be "clean." Data elements are transformed, and data files are structured according to intended use. A project manager in IT spearheads the initial IMS project with the head of institutional research (IR); the latter expert has hands-on experience in statistical research investigation, assessment, and reporting. From the outset, and for ongoing development and maintenance, IT's technical expertise and IR's knowledge of application are drivers. Information sharing from contributing offices, which also use the Data Library, is essential, as is their adherence to SOPs. This is an IMS – an IMS that adapts to changing and increasingly complex requirements for usable information.
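As a minimal sketch of a census capture into such a Data Library, the code below merges records from three functional areas by student ID into one research-ready file. The offices, field names, and records are all hypothetical, invented for illustration; a real capture would follow the institution's own data dictionary and SOPs.

```python
# Sketch of a Data Library census snapshot; all field names and
# records below are hypothetical, not a real system's schema.
import csv
import io

# Records exported from three functional areas (illustrative data).
admissions = {"1001": {"hs_gpa": "3.4", "first_gen": "Y"},
              "1002": {"hs_gpa": "2.9", "first_gen": "N"}}
registrar = {"1001": {"credits": "15", "major": "BIO"},
             "1002": {"credits": "12", "major": "BUS"}}
fin_aid = {"1001": {"pell": "Y"}, "1002": {"pell": "N"}}

def census_snapshot(census_date, *sources):
    """Merge functional-area records by student ID into one
    research-ready row per student, stamped with the census date."""
    rows = {}
    for source in sources:
        for sid, fields in source.items():
            rows.setdefault(sid, {"student_id": sid,
                                  "census_date": census_date}).update(fields)
    return list(rows.values())

snapshot = census_snapshot("2014-FALL", admissions, registrar, fin_aid)

# Write the merged file to the Data Library (here, an in-memory CSV).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=sorted(snapshot[0]))
writer.writeheader()
writer.writerows(snapshot)
print(buf.getvalue().splitlines()[0])  # the header row of the archive file
```

Because every downstream study reads from these frozen, dated snapshots rather than from live transactional systems, figures reported a year apart remain reproducible.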
Restated: to ensure institutional viability, each institution must develop an IMS with the capacity to: (a) follow prescribed methodologies for accountability; (b) statistically investigate cost-effectiveness in improving government-prescribed outcomes and their measures; and (c) track/assess and report progress. Current systems do not improve with age. Unbeknownst to many leaders, many data analysts fulfill requests for figures, including ad hoc requests from their presidents, by producing lists in Excel spreadsheets, then highlighting columns and even counting by hand. Other, only somewhat less awkward, procedures slow production, increase the chance of error, and deter research that produces sophisticated information on cost-effectiveness in achieving any outcome of interest to the president and/or the government. Such a "system" will not handle volume. As accountability reporting increases, so, too, must an institution's IMS, so presidents can report ROI and investigate cost-effective strategies to improve what they report in the face of limited institutional budgets.
Components of ROI
A variety of components of ROI have been proposed and argued in impassioned discussions over the College Rating System and College Scorecard (Office of the Press Secretary. 2013, August 22. FACT SHEET on the President's Plan to Make College More Affordable: A Better Bargain for the Middle Class. The White House. Retrieved from http://www.whitehouse.gov/the-press-office/2013/08/22/fact-sheet-president-s-plan-make-college-more-affordable-better-bargain-) (College Scorecard. College Affordability and Transparency Center. http://www.whitehouse.gov/issues/education/higher-education/college-score-card).
The College Scorecard is an interactive college search tool that aims to empower students/families with information so that they can make informed decisions regarding value. Reviewer-friendly graphics compare a searched institution with national statistics and with institutions of similar mission. Components cover cost of attendance; graduation rates; federal student loan default rates; median federal borrowing amounts and monthly repayment amounts; and employment. The College Scorecard is linked to an interactive tool for exploring and planning careers and interests, thereby directing students/families to consider life after college in their college selection (http://www.mynextmove.org/). The government is defining ROI, putting it on the radar of students/families, and establishing criteria accordingly for institutions to receive funds.
In the opinion of many academicians, other critical components of ROI include those associated with learning outcomes (the attainment of program- and course-specific learning objectives) and the ability to be a life-long learner (analytical, critical thinking, and problem-solving skills, and creativity). In the opinion of many employers, additional measures include the "soft" skills (e.g., writing), which are nurtured in General Education/Arts & Sciences departments and, ideally, reinforced in major programs of study. In the opinion of many leaders and administrators, the components of ROI vary according to the mission of their institution. A pronounced contrast is the career institution [in which students seek employment after graduation] versus the research university [in which many bachelor's graduates pursue higher degrees] versus the open-access community college [from which many students transfer to four-year institutions and which others never intended to graduate from]. (Lederman, D., Stratford, M., & Jaschik, S. 2014, February 7. Rating and Berating the Ratings. Inside Higher Ed. Retrieved from http://www.insidehighered.com/news/2014/02/07/colleges-and-analysts-respond-obama-ratings-proposal).
Nonetheless, presidents must ensure that an IMS is in place for accountability reporting to the government and, importantly, for data analysis to research and assess cost-effective means of improving ROI. Many IMSs are not equipped to handle more reporting, much less use data to conduct statistical research investigations on how to improve results, and assess cost-effective means of doing so.
The Substance of Cost-Effectiveness For An Institution of Higher Education: Observe, Measure, Predict, Control
While an IMS enables analysis, solutions require creativity, resourcefulness and an enterprising spirit. Recall the sayings, “Necessity is the mother of invention” and “Don’t reinvent the wheel.”
At the analysis stage, the components of cost-effectiveness in producing a good ROI/value are institution and student specific, and they require data-based research to identify. One may develop hypotheses from observation. Remember the experimental research adage, "Science begins with observation" – but then you measure, so you can predict and control/manage outcomes: observe, measure, predict, and control. Review scholarly studies in the higher education literature (http://nces.ed.gov/pubsearch/) (Newman, J., 2014, February 21. How Average Net Price Fails to Capture the 'Best Bang' for Your Buck. Chronicle of Higher Education. Retrieved from http://chronicle.com/section/Home/5). Build on the current in-house body of research. Test your hypotheses. The institution's predictors of, or precursors to, outcomes of interest are the institution's business enterprise yardsticks.
Data-based research investigations that utilize multivariate methods can disaggregate your applicant and student data into applicant and student profiles that predict academic performance, persistence to graduation, student loan default, short-term and long-term employment, and so forth – any outcome of interest that you operationally define, and for which you collect data and enter them into your research database in an appropriately structured and functioning IMS. Applicant and student profiles might consist of statistically significant measures of academic preparation, ability to pay, and academic commitment, as well as demographic characteristics (including first generation to attend college) – possibly academic program-specific. When pockets of students are identified, you direct your limited resources accordingly rather than cast a broad net. A suite of customized provisions for defined pockets of students enables you to be cost-effective in achieving outcomes that are important – to the institution or to the government.
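As a minimal sketch of such a multivariate investigation, the following fits a logistic regression from scratch to predict persistence from applicant characteristics. Every predictor, coding, and student record is invented for illustration; a production analysis would use a vetted statistical package and the institution's own Data Library.

```python
# Sketch of multivariate profiling via logistic regression; the
# predictors and records are hypothetical, invented for illustration.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Per-sample (stochastic) gradient ascent on the log-likelihood."""
    w = [0.0] * (len(X[0]) + 1)  # intercept followed by one weight per predictor
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    """Predicted probability of the outcome for one applicant profile."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Predictors: [HS GPA rescaled to 0-1, ability to pay 0-1, first generation 0/1]
X = [[0.9, 0.8, 0], [0.8, 0.7, 0], [0.4, 0.2, 1], [0.3, 0.3, 1],
     [0.7, 0.6, 0], [0.5, 0.2, 1], [0.85, 0.9, 0], [0.35, 0.25, 1]]
y = [1, 1, 0, 0, 1, 0, 1, 0]  # 1 = persisted to graduation

w = fit_logistic(X, y)
high_risk = predict(w, [0.3, 0.2, 1])  # underprepared, low ability to pay
low_risk = predict(w, [0.9, 0.9, 0])
```

The fitted probabilities define the "pockets" of students: profiles whose predicted probability of the outcome falls below a chosen threshold become candidates for the customized provisions described above.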
Example: Informed by a low probability that, business as usual, a defined group of academically underprepared, first-generation college students with low ability to pay will attain a desired outcome(s), you can preemptively generate a suite of creative solutions such as: (a) a reduced credit load; (b) supplemental programmed instruction to reinforce classroom instruction; (c) online summer courses to manage time to graduation and enable summer employment while living at home; (d) on-campus employment during the academic year; (e) a particular amount of grant money; (f) learning objective-specific academic tutoring; (g) financial aid counseling; (h) academic advising on a major program of study that is a good fit with their academic ability and interest; and (i) a peer coach.
Through data-based research and creative solutions, you can direct limited institutional resources – time, money, and effort – where they are needed and effective, and then assess/monitor and fine-tune accordingly.
With directed allocation of resources, you can streamline and revamp staff according to expertise and demonstrated need. You can determine the optimal number of full-time faculty who can double as instructors and academic advisers. On-campus jobs can be directed to the most financially and academically needy students. Reduced course loads can be prescribed preemptively, based on academic need. In its various forms, blended learning can be directed to where it is an appropriate alternative: online courses, competency-based "courses," supplemental programmed instruction for reinforcement, and learning objective-specific tutoring. Cross-train staff in particular functions to accommodate fluctuating student needs.
The Institutional Scorecard
An institution can develop its own scorecard of cost-effectiveness in delivering a high ROI – to students/alumni, the government, and to itself. The scorecard includes the government’s measures of ROI and the institution’s additional outcomes of interest that represent its values.
The in-house Scorecard can contain four sets of metrics:
1. Metrics for outcomes that constitute ROI – institutional and government specified
2. Research-identified predictors of, or levers that impact, ROI
3. Benchmarks of progress toward outcomes and associated levers (assessment), and
4. Financial yardsticks, such as budgets for functional areas, programs, and resources, and the associated numbers of faculty and staff, allocated selectively according to need/use.
Year-to-year persistence/retention is an example of a benchmark for an outcome of interest: specifically, year-to-year persistence of an entering cohort assesses, or tracks, progress toward graduation. Tracking the status of levers that are associated with progress toward bottom-line outcomes (ROI) enables the institution to focus intervention, fine-tune strategy, and correct course.
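The cohort-tracking arithmetic behind such a benchmark can be sketched in a few lines. The cohort members, enrollment records, and census years below are invented for illustration.

```python
# Sketch of year-to-year persistence for an entering cohort;
# student IDs and enrollment sets are hypothetical.

# IDs of the fall 2010 entering cohort, then IDs of cohort members
# still enrolled at each subsequent fall census.
cohort_2010 = {"A", "B", "C", "D", "E", "F", "G", "H", "I", "J"}
enrolled = {2011: {"A", "B", "C", "D", "E", "F", "G", "H"},
            2012: {"A", "B", "C", "D", "E", "F"},
            2013: {"A", "B", "C", "D", "E"}}

def persistence_rates(cohort, enrolled_by_year):
    """Share of the original cohort still enrolled at each fall census."""
    return {year: len(cohort & ids) / len(cohort)
            for year, ids in sorted(enrolled_by_year.items())}

rates = persistence_rates(cohort_2010, enrolled)
for year, rate in rates.items():
    print(year, f"{rate:.0%}")  # prints 80%, 60%, 50% for the data above
```

Plotting these rates against prior cohorts, or against the rates of an identified at-risk pocket, is what turns a single benchmark into an early-warning lever.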
Red Flags: Tips for the Cunning and Discerning Leader
1. Throwing more people at a problem: This is often a costly complication and futile attempt to circumvent systemic obstacles such as lack of coordination, cooperation, and/or expertise. Diagnose the cause of the problem first, and then act accordingly. (Marcus, J. 2014, February 6. New Analysis Shows Higher Ed Boom in Administrators. Huff Post. Retrieved from http://www.huffingtonpost.com/2014/02/06/higher-ed-administrators-growth_n_4738584.html) (Desrochers, D.M. & Kirshstein, R. 2014, February 5. Changing Staffing and Compensation Patterns in Higher Education. American Institutes for Research. Retrieved from http://www.air.org/resource/changing-staffing-and-compensation-patterns-higher-education).
2. Adding more staff: Possibly the leader needs to revamp the system.
3. The magic bullet of assigning, or hiring, one person to shoulder a burden: No man is an island.
4. Figures to back opinions: Higher education has no shortage of people with opinions. Weigh the evidence before you act. Question data integrity. No one, not even your data guru, is exempt from being grilled; the burden of proof applies to your data guru as well.
5. The mystique of data gurus and mystery stashes of data: Opinions cannot be supported without the requisite data. Ask for copies of data dictionaries so you know whether the data even exist in data file structures [for the research, assessment and reporting] to advance your institution’s ability to be cost-effective in delivering a high ROI. If you do not understand the data dictionaries, it is because they are not clear.
6. Three-year history minimum: Look for consistency to support the assertion of a trend (reliability). Understand the dynamics of outliers. Baseline comparisons and references support assertions of progress, or improvement.
7. Black box of forecasting models: Know the multiple elements (that is, predictors or factors) in a forecasting model. Do they have intuitive appeal (face validity)? Ask for a [bivariate] breakdown of each element’s impact on the outcome of interest. Ask about missing data and how it is handled. There should be an investigation of why it is missing, and a statistical study of the impact of missing values on the outcome of interest.
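The two diagnostics suggested in item 7 – a bivariate breakdown of one element's impact on the outcome, and a check on whether records with missing values differ on the outcome – can be sketched as follows. The records, field names, and GPA cut are invented for illustration.

```python
# Sketch of two forecasting-model diagnostics on hypothetical records:
# a bivariate breakdown of one predictor, and a missing-data check.

records = [
    {"hs_gpa": 3.6, "graduated": 1}, {"hs_gpa": 3.2, "graduated": 1},
    {"hs_gpa": 2.4, "graduated": 0}, {"hs_gpa": 2.1, "graduated": 0},
    {"hs_gpa": None, "graduated": 0}, {"hs_gpa": 3.8, "graduated": 1},
    {"hs_gpa": None, "graduated": 0}, {"hs_gpa": 2.8, "graduated": 1},
]

def rate(rows):
    """Graduation rate for a group of records."""
    return sum(r["graduated"] for r in rows) / len(rows)

# Bivariate breakdown: graduation rate above vs. below a GPA cut.
present = [r for r in records if r["hs_gpa"] is not None]
high = [r for r in present if r["hs_gpa"] >= 3.0]
low = [r for r in present if r["hs_gpa"] < 3.0]

# Missing-data check: do records missing HS GPA fare differently?
missing = [r for r in records if r["hs_gpa"] is None]

print("grad rate, GPA >= 3.0:", rate(high))
print("grad rate, GPA <  3.0:", rate(low))
print("grad rate, GPA missing:", rate(missing))
```

If the missing-GPA group's rate diverges sharply from the others – as it does in this invented example – then the values are not missing at random, and silently dropping or imputing them would bias the forecast.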
It is beyond the scope of this article to evaluate the government's College Scorecard and its role in higher education governance, on which many people/parties have been weighing in. In accordance with your institution's values, you can develop your own Institutional Scorecard for cost-effectiveness in providing a good ROI to your students and other constituents. Your Scorecard is powered by a robust IMS that enables the investigative statistical research, assessment, and reporting to inform and support cost-effectiveness in providing a good ROI and to strengthen the long-term viability of your institution – the leader's legacy.