Last fall the Department of Education announced a pilot project to open up financial aid to students in unaccredited education programs, such as coding bootcamps. Bootcamps and universities that have teamed up to submit proposals for the “Educational Quality through Innovative Partnerships” (EQUIP) program are still waiting for the green light from the department.
In the meantime, industry experts are figuring out the logistics of running a successful EQUIP experiment. Entangled Solutions, a San Francisco-based “innovation agency” for higher education, is out with a report on mitigating conflicts of interest among the different parties that will work together under the program: alternative education providers (bootcamps), traditional higher-ed institutions, and “quality assurance entities” (QAEs), which will evaluate the pilot programs, much like accreditation agencies evaluate colleges and universities.
Conflicts of interest in the EQUIP program could negatively impact student outcomes. Imagine a bootcamp claims that 90 percent of its graduates land a job related to their training within three months of graduating. If the QAE that’s auditing the bootcamp has made financial investments in that program, there’s a clear incentive for the evaluator to support positive claims that could mislead students.
The report from Entangled Solutions provides recommendations for ensuring QAEs remain impartial and keep students’ best interests front and center. The Department of Education has offered little guidance on what these new third-party auditors should look like or how they should act, says Michael Horn, co-founder of the Clayton Christensen Institute for Disruptive Innovation and a principal consultant for Entangled Solutions. “QAEs really weren’t thought of very seriously. You don’t see any mention in the guidelines about who can be a QAE.”
The concept of a QAE is similar to how a publicly traded company must have its financial records audited under the Sarbanes-Oxley Act. If an individual is going to invest in a company, the reasoning goes, he or she should be assured that the company’s financial claims are accurate.
Horn says the same holds true for students: If they’re investing in education, they should be able to trust their program’s reported outcomes. Entangled Solutions decided to pull itself out of the running to be a QAE because of conflicts of interest. The firm is building a consulting business to help colleges innovate, meaning it’s already interested in seeing these types of programs succeed and can’t objectively assess them.
In its report, the firm proposes three guidelines for mitigating QAE conflicts of interest:
- The QAE is economically independent of the program it’s evaluating (e.g., the QAE should not be a lender to students in the program).
- The program under evaluation should not be able to influence the QAE by non-economic means (e.g., a QAE should not audit a program where its former employees work and can sway the evaluation).
- There should be an overseer, such as a nonprofit, to penalize QAEs that have competing interests or engage in other wrongdoing.
The report suggests that programs rotate QAEs every five years. It also recommends that programs work with two QAEs—a primary one that complies with the above guidelines, and a secondary one that has a less influential, advisory role. Horn says that Entangled Solutions could fall into this latter camp.
Entangled Solutions concludes that its recommendations are a starting point, not an exhaustive list of all potential conflicts of interest. When asked what entity might make an exemplary QAE, Horn points to consulting and auditing giant Deloitte, public accounting firm Hood & Strong, and a partnership between strategy consulting firm Tyton Partners and an accounting firm.
He says he hopes the conflict-of-interest guidelines influence how student outcomes are evaluated not only in the EQUIP program but across higher education in general. The recommendations arrive as for-profit colleges—Apollo/University of Phoenix, DeVry University, ITT Tech—come under increased fire for making false claims about student outcomes in recruiting materials.
“The ecosystem should shift from measuring inputs to student outcomes,” Horn says. “I hope we inform that conversation.”