Making data-informed decisions isn’t for the faint of heart, as many CFOs and other institution leaders can attest. The challenges of implementing a successful process for getting the most out of data are many, and questions arise at every stage: building a data warehouse, ensuring data integrity, and sharing previously confidential unit and department information. Other frequent concerns include integrating new methods of decision making into an established hierarchy and culture, calming fears that programs will be eliminated, creating buy-in among cynics about the multifaceted process, and engaging the overall campus community around a comprehensive strategic effort based on analytics.
Despite these obstacles, many institutions realize that they must rely on analytics to support decision making in today’s higher education environment, which is fraught with challenges, such as waning enrollment, concerns about financial viability, and clarion calls for improved student outcomes.
Three research institutions—University of Maryland, College Park; Drexel University, Philadelphia; and Florida International University (FIU), Miami—have set goals and processes for data usage uniquely suited to their campuses and cultures and have been working to overcome the barriers that have cropped up as they move forward. Each institution now finds itself at a different stage of discovery in this new information-based era.
The Cost of Instruction
After attending a NACUBO conference several years ago, leaders at the University of Maryland decided to seize upon the potential of data analytics by implementing a cost-of-education model, says Cynthia Roberts Hale, associate vice president for finance and personnel, office of the senior vice president and provost. “We felt we needed more data and information about the cost of our academic instructional programs,” she explains.
After an initial collaboration with a consulting firm, the institution is now loading the third year of data into the model, which is producing a detailed analysis of what it costs to deliver instruction section by section and major by major on a department-by-department basis. “We can actually understand the full cost of our instructional programs,” she says. “We’ve never had anything close to that kind of information in the past.”
Hale anticipates that this deeper understanding of the university’s data will especially help it as it undertakes the process of creating several new majors. “This is the first time we can honestly understand how much they should cost and how we should resource them. Because this university has worked on a historical budget model for many years, this is revolutionary.”
The data have already been useful when answering queries from legislators and regents about the cost of programs or services. “Several years ago, we put in differential tuition for computer science, business, and engineering,” Hale says. “At the time, our calculations for why we needed to charge more for those programs were rather soft. Today, we can show exactly why it costs more to offer those degrees than some others.”
Still, piecing together the information proved daunting. “It has been harder than we thought to gather these data from a campus of this size and complexity. We had to collect the data department by department to understand what each department’s teaching load was and how much time faculty spent in the classroom, grading papers, and holding office hours. Each department is slightly different. The collection drove the buy-in because people have to believe the data are correct,” she says.
Without a consolidated database, Hale and her four-person team worked with the people across campus who controlled the various data systems, bringing information from facilities, the registrar, and financial aid into one system. Although this was a large undertaking for the team, consolidating these sources allowed the campus to harness insights that no single database could provide on its own.
“That was certainly the first challenge, and I wondered if we would actually pull it off,” she says. “The second one was the faculty expectation regarding the level of detail we would build into the model, so they would believe that the data coming out were accurate. Faculty, of course, love to argue with administrators and love to argue over data, so we really had to spend several years making sure we had captured their workload accurately.”
Right now, Hale is trying to ensure that people understand and use the prepared reports. “The deans all have access to their college’s reports,” she says. “We are releasing a preliminary set of reports that shows department by department, program by program, the cost of instruction and the components of the costs. They can see the salary components, the facility costs, and the college overhead. They can now begin to parse where they can achieve efficiencies.”
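The component-by-component reports Hale describes can be pictured with a brief sketch. This is purely illustrative, not the University of Maryland's actual model; the programs, dollar figures, and the cost-per-credit-hour metric are all hypothetical assumptions.

```python
# Illustrative sketch only: rolling up per-program instruction cost from the
# components the article names (salary, facility costs, college overhead).
# All names and figures below are hypothetical.

def program_cost(salary, facility, overhead, credit_hours):
    """Return total instructional cost and cost per credit hour for one program."""
    total = salary + facility + overhead
    return total, total / credit_hours

# Hypothetical data: program -> (salary, facility, overhead, credit hours taught)
programs = {
    "computer_science": (4_200_000, 600_000, 450_000, 30_000),
    "history": (2_100_000, 300_000, 250_000, 22_000),
}

for name, figures in programs.items():
    total, per_credit = program_cost(*figures)
    print(f"{name}: total ${total:,}, ${per_credit:,.2f} per credit hour")
```

A rollup like this also makes visible why differential tuition for some programs can be justified with hard numbers: the per-credit-hour figures differ once the full component costs are assembled.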
She admits that some faculty and administrators have been nervous that they will eventually lose resources based on data results. “That’s been a subliminal conversation for some time. We have tried to consistently assure them that our purpose is to allow efficiencies, build new programs with appropriate resources, and respond appropriately to external entities that ask about the cost of instruction. We really don’t have any intention of using this database to take money away from units.”
According to Hale, the University of Maryland’s Associate Vice President for Finance and Chief Financial Officer Paul Dworkis has been a champion and cheerleader of analytics, which he considers essential to the institution’s overarching modernization efforts. “He has been working very closely with those of us in the provost’s office to identify dollars that can now be invested in building up programs and enhancing new programs,” Hale says.
She explains that when the university initially established the school of public health, it was actually underfunded. “Looking at the cost-of-education model, we can see the school really wasn’t resourced adequately. The CFO has been working with us to identify resources so that we can create a budget that is adequate to deliver instruction and have a top research program. It has been a very effective partnership.”
Hale points out that higher education is reaching an important moment in history. “These old budget models are not going to serve us well,” she says. “This is one of the efforts we’re making to transition to an organization informed by data. We’re just getting started.”
Data Support Planning
“The first step in a data-driven organization is building a data warehouse, which is a completely invisible effort,” says Mark A. Freeman, vice provost, planning and institutional research, Drexel University, Philadelphia. “The case for a data warehouse is very abstract, whereas the pressures on IT and institutional research are immediate. Many institutions can’t carve out time and energy to build up the resource itself. Drexel has, although it’s been slow. We have to balance what is realistic versus what is ideal.”
Freeman reports that Drexel currently has robust data in a wide range of areas, including HR, finance, student enrollment, and surveys. The information is used to support the institution’s strategic planning, as well as its 14 colleges and more than 100 departments—all of which have their own data needs.
For example, the data have proven essential when developing academic programs and evaluating whether current programs are successful from the point of view of institutionwide priorities. “Adding data to these conversations, which have been happening for decades, is new,” Freeman explains. “We spend a lot of time making sure data’s role in these strategic conversations is productive and helps to inform decisions in the context of preexisting rubrics people have in place for how they make decisions.”
Freeman believes the biggest challenge to data-informed decision making is organizational. “Higher ed has its own culture of how we make decisions. One of them goes by the name of HiPPO or the ‘highest paid person’s opinion.’ Traditional authority determines what the decision is. Because we have so many stakeholders, consensus is another challenge. We have this implicit idea that balancing the views of all stakeholders will result in the best decision.”
Sometimes, he says, newly acquired data may indicate that a poor decision was made in the past or that resources were mismanaged. “If you’re too concerned about stepping on toes, it can pervert the data you are allowed to share. Data don’t care who is in authority. Data don’t care about consensus. Data don’t care about protected domains and silos. Data cut through all of that.”
Just adding data and analysis, no matter how well-informed, to a preexisting organizational culture does not necessarily create data-informed decisions, he advises. “We need to be intentional about the ways that data cut across the rubrics of decision making. We have to accommodate the information, which requires changes to the ways we think and behave. It’s not enough to have analysts sitting at the table.”
Using data ranging from student surveys to expenses and revenues, Drexel is developing a rational, objective, and quantifiable tool for defining a successful academic program. But fully leveraging the power of these data will mean reenvisioning traditional processes. “We have this great tool, but how does it actually inform decision making around programs that need more investment or those that might be overfunded?” he asks. “There is an existing process for how you do this through the faculty senate that has some data elements, but they aren’t the same data elements. This means our governance processes would have to be revised if, in fact, we are analytically driven in how we make these decisions.”
Another challenge, Freeman says, is the fear that data will render as irrelevant or redundant the very governance structures, processes, and people currently responsible for decision making and resource allocation. “Data don’t make decisions,” he emphasizes, “people do. Data are a tool, and an imperfect tool. We still need humans to fill in the gaps. We can’t get hung up on whether the data will make the decisions for us. They merely support our decision making.”
Engaging the Campus Community
In 2014, when Kenneth G. Furton assumed the position of provost, chief operating officer, and executive vice president of academic affairs for FIU, he advocated for the development of a strong data analytics program.
“Data come first,” Furton says. “I realized early on that we had multiple systems housing data and not a lot of trust in the department then called institutional research. We created an office called AIM (Analysis and Information Management) and a website called accountability.fiu.edu. All of the data now reside at this one site. Our first step was to democratize the data and build confidence that the data were accurate.”
Next, FIU identified 20 critical performance indicators relating to resources, research, and student success—with a special emphasis on the latter. “We had targets we wanted to reach from 2015 to 2020,” Furton says. “The goal was to track our progress and determine if we were on target, and if not, to determine what we needed to change. The student success side boils down to four key components: finding ways to improve retention, keeping students on track to avoid excess hours, achieving on-time graduation, and helping students get employed.”
When the data were decentralized and kept by individual units, determining accuracy and overall graduation rates was difficult, according to Pablo Ortiz, vice president, regional academic locations and institutional development, FIU.
“By organizing our data management and data analytics, we’ve been able to understand where our students are along their academic pathways, what their needs are, their success rates, gaps that remain, and how we can intervene,” Ortiz adds. “Organizing around this effort created a shared accountability environment for the university. We understood that we could not solely depend on the units to carry the weight of improving our performance with graduation rates. We needed a well-synchronized and supported effort from central support units as well as from the colleges and departments.”
The FIU website now includes graduation rates, student evaluations of faculty, and pass rates of individual classes. “Everything is very transparent,” Furton emphasizes. “No data are kept hidden.”
FIU also initiated leadership meetings, called Communication Protocol for Accountability and Strategic Support (ComPASS), that engage the campus community and serve as the cornerstone of managing and employing the data. The ComPASS meetings bring together about 150 individuals, including the president; provost; chief financial officer; all vice presidents; the deans of the colleges and their strategic liaisons in each of their units along with chairs; and all of the central supporting unit leads, including diversity, information technology, and analytics.
“Decision makers come together three times a year to look closely at our performance, using analytics to help us make informed decisions based on the data,” says Ortiz, who coordinates the ComPASS process. “With that information, we’ve been able to improve our ability to collect the data, interpret it and analyze it with some of our central units, and disperse it out to individual units so that they can make decisions as well in an effort to improve student performance.
“This process has allowed us to embark on a journey to create much more community engagement in how we as an institution can improve our behavior and strategies to support our students,” Ortiz continues. “In a typical institution, you may have faculty or chairs who may be somewhat disconnected from the organization’s priorities and the resources available to help them succeed. Bringing everyone together has allowed much more clarity about the institution’s priorities, goals, and expectations.”
During the all-day meetings, the provost sits on one side of the president and the CFO sits on the other. “We are the ones who primarily ask questions,” Furton says. “We can drill into the data through the dashboards. If we expected a college’s graduation rate to go up by 4 percentage points and it only went up 2 percentage points, we can drill down into an individual course to look at passing rates and discuss why. The beauty of the process is everybody in decision making is there so there’s no place to pass the buck. Usually, decisions are made on the spot if we need to change a process or fund a program.”
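The drill-down Furton describes, moving from a college-level graduation shortfall to the pass rates of individual courses, can be sketched in a few lines. This is not FIU's actual dashboard; the colleges, courses, enrollment figures, and the 80 percent threshold are hypothetical assumptions for illustration.

```python
# Illustrative sketch: flagging courses whose pass rate may be dragging down a
# college's graduation rate. All data and the threshold are hypothetical.

course_results = [
    # (college, course, students passed, students enrolled)
    ("engineering", "CALC1", 310, 420),
    ("engineering", "PHYS1", 350, 400),
    ("engineering", "CHEM1", 360, 410),
]

def pass_rate(passed, enrolled):
    """Fraction of enrolled students who passed the course."""
    return passed / enrolled

def low_pass_courses(results, college, threshold=0.80):
    """Return courses in one college whose pass rate falls below the threshold."""
    return [
        course for c, course, passed, enrolled in results
        if c == college and pass_rate(passed, enrolled) < threshold
    ]

print(low_pass_courses(course_results, "engineering"))  # flags the weakest course
```

The point of the sketch is the shape of the inquiry, not the arithmetic: once the data are centralized, the same question can be asked at the college, department, or course level without waiting on individual units to report.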
Furton admits that when he was a department chair and a dean, he had no idea what his graduation or employment rates were. “We thought our role was to create great programs and hire great faculty and publish great research,” he says. “Now, I ask deans, ‘What will be your graduation rate next year?’ If they don’t know, they better have an answer next time.”
Furton and Ortiz agree that the next step in FIU’s data journey will be predictive. “In the past, we have been reactive,” Furton says. “We are just beginning to use predictive analytics. The next stage is to get more advanced in predicting what’s going on, so we can deploy centralized solutions to the unit levels.”
MARGO VANOVER PORTER, Locust Grove, Va., covers higher education business issues for Business Officer.