Metrics can play a significant role when evaluating the success of partnerships

Excerpted from the May 2021 issue of University-Industry Engagement Advisor. UIDP members can view the entire issue here.

Can you “measure” the success of your U-I partnership? To a degree, you certainly can, say the experts. Universities and their industry partners employ a variety of metrics to help determine the strength of existing relationships and the projects in which they’ve collaborated. Quantitative measures can indicate the relative success of sponsored research projects, student hires by partners, capstone project results, satisfaction among industry partners, faculty and students, and much, much more.

“Metrics provide great and useful data point parameters,” said Robert Garces, head of strategic innovation (U.S.) with EMD Serono, a panelist in the “Metrics for Evaluating U-I Partnerships” session held during UIDP Virtual 2021.

“We use a CRM and a dashboard; we measure what’s black and white,” adds Jack Ellenberg, associate vice president in the Office of Corporate Partnerships and Strategic Initiatives at Clemson University.

However, they caution, metrics will never tell the entire story of a partnership. Other factors, such as the strength of personal relationships between partners, are also critical in making those determinations — and far more difficult to measure.

“There are additional factors, like how responsive is the other side to questions, [what is their] feedback, how good is our project fit to what they perceive their needs are. If they do not really fit, it will probably not be a good partnership,” noted Garces.

“To me, people are the critical part,” adds Ellenberg. “We have tangible guidelines that people can understand, but it all comes down to communicating freely as a partnership.”

The type of projects involved can also affect how much you rely on quantitative metrics, added Spike Narayan, director of science and technology for IBM. “Metrics are colored by the goal of exploratory research,” said Narayan, who emphasized that he looks more at strategic initiatives than tactical programs. “In tactical projects there are a lot of milestones and deliverables, good measures of success that are easy to define.” But in exploratory projects, success is much harder to quantify and consequently to measure. “It’s nice to tie investment dollars to ROI, but not necessarily easy. The size of exploratory investment helps define the role of metrics.”

At the University of North Carolina, “evaluation has been very informal — a lot of personal interviews and data reviews,” says Joonhyung Cho, director of business development, Industry Relations.

And for Anne O’Donnell, senior executive director of development in UC San Diego’s Corporate Relations Office, the key consideration is not necessarily the metrics you use, but how they are determined. “Let’s agree ahead of time how we’re going to evaluate ourselves — what will be the metrics for success,” she shares. “And it can’t just be dollars; we’ve brainstormed some intangible metrics.”

Choose your metrics

Naturally, metrics vary from university to university and company to company, depending on the type of projects they’re involved in, their mission, and their partners. For example, Clemson takes its cues from the strategic engagement portfolio it developed in 2017. Key areas include student involvement, academic programs to engage industry, research competencies, aligning with industry partners in research, and economic development including potential campus footprints.

“We created a strategic steering committee,” says Ellenberg. “For every partnership we put together, a university team is mirrored by a company team — two to three on each side — who talk on a regular basis about what holes exist, and how we can plug the holes. We look at our partners’ investments in Clemson; we have to show their ROI and what the ROI is for Clemson.”

If the company engages students, the university can draw on data from a customized version of Salesforce to track the number of co-ops and internships created and how many students were hired by the partners. “We know [how many] students enroll in certificate programs driven by industry,” he adds. “For sponsored research, we have the value. If they do have a physical presence, we know the capital investment they make.” Those data, he adds, are made available to the university board of trustees and to the corporate partners.

(A number of engagement professionals use tools like these to help evaluate their performance with industry. A recent one-minute informal survey by UIDP showed that 54% of respondents use such tools to evaluate research performance, and 72% use them to determine the match between industry needs and academic expertise. See Figures 1 and 2 below.)

[Figures 1 and 2: bar charts of UIDP survey results]

Ellenberg sees great value in having hard numbers. “It helps us from a predictive standpoint,” he notes. “Look at resources. It goes without saying that if a majority of our partners focus on one or two departments it allows us to look at that and determine the resources that need to be made available to support them. And from a research standpoint, where are our industry partners leaning — do they focus on one college, one department, one faculty member?”

When he looks across the landscape, he continues, he can compare the partners to each other — how many are in co-ops, research, and so on. “But that’s where the comparison stops, because every company is unique, even if within the same industry sector,” he says. “Relationships are not a cookie-cutter approach. What works for company A may not work for company B. It’s not fair to your partners to do a comparison with others, as much as it is to glean [information] on why they’re focused on certain areas.”

Garces said he looks at the number of relevant publications, known grants, patent applications, and so on. “The challenge to me is the number of parameters we use and look at, but weighting depends on the project, and internal constraints,” he adds. “What are we trying to fulfill in the project? What do we need? And based on the capabilities or abilities of the partner, do they meet the criteria?”

Strategic vs. transactional metrics

Smaller deals, noted Narayan, result in traditional metrics like publications, conference talks and papers, “but I argue it’s been hard to get value from them and measure ROI for the company.”

You get a better sense of value, he continued, from an extended strategic viewpoint, although doing that, he admits, is a hard task. “One of the best ways is to have the student spend a significant amount of time on [the IBM] campus,” he noted, because the talent pipeline is getting more and more important. Measuring how many students are hired, for example, “forces us to look more broadly at schools, and changes the way in which we scout. It will change the landscape.”

O’Donnell re-emphasizes that the metrics have to be established jointly with the corporate partner. “I know mine [metrics]; I need to know what motivates the other person,” she comments.

After all, she continues, metrics are part of an overall agreement. “Clarify what goals you need for each other,” says O’Donnell. “Like how a pipeline needs to be built. I may want you to start hiring, but [the] partner says, ‘Let’s look at four years.’ The university wants funds, the company wants ROI. Listen for the metrics in the voice of your corporate partner.”

From her point of view, “citations are an important metric,” she says. “You can also look at expenditures, the number of students who got jobs, revenue and investment.”

The personal approach

Input from your industry partner may not always be in the form of “hard” metrics like surveys or student hires, but that input is nonetheless of great importance when evaluating a partnership.

For Cho, that means personal interviews. “The first question I always ask is, ‘How do you feel our interaction has been?’” he shares. “I always try to understand their definition of the partnership they are part of. As someone who put the framework together, I have a 50,000-foot vision of it and where it should go; their view may be very different. I try not to correct them, but to understand where they’re coming from.”

One downside of this personal approach, he notes, is that responses are by nature grounded in personal experience rather than a system-level view. “But when I interview 10 people, I can understand consistent issues,” he adds. Cho says he may conduct these interviews once a year, depending on the state of the partnership.

When it comes to evaluating specific projects, he continues, “we still don’t use any tools so to speak.” Rather, he says, he will interview faculty for post-evaluation. “We try to understand how the interactions with the corporate partner have been,” he explains. “Did you drive it, or did you feel the company drove it? When it came to students going to a conference or publishing a paper, did you feel they were supported by the company, or did you feel the company was irritated by your published papers? Did the students feel intimidated by the corporate partner’s science presence, or did they feel a positive relationship?”

Cho says he has found personal interviews to be an effective tool. “We have done surveys with faculty by e-mail, but we did not have a good participation rate, so we started doing personal interviews,” he explains. “We want to make sure they had a good experience and feel they helped move the science forward.” As for students, “we want to make sure they’ll be faculty ambassadors, talking to other faculty. At the end of the day, we’re trying to change the culture, to help them appreciate partnership in general and what it’s like to work across the aisle,” said Cho.

UC San Diego’s O’Donnell says she does gather data on the satisfaction of corporate affiliates, seeking to get feedback on what worked and what didn’t — and whether the program met the partner’s needs in general. “It’s very important that the corporate partner feels they have a voice in what’s happening,” says O’Donnell. In board meetings, and through surveys, she has asked participants what was most valuable to them. “We brainstorm five to 10 minutes at all board meetings,” she says. “You weigh in to buy in.”

There is, she continues, no easy answer on evaluating partnerships. “The relationship with every company is relative to that company,” she says. “The structure we put in place allows not just for free communications, but to ask the questions. ‘These objectives — have we met them? Are we on course? Should we alter course?’”

Still, O’Donnell reiterates, what you do up front with your partner goes a long way towards determining what those responses will be.

“We do brainstorming: ‘What are your expectations?’” she shares. “‘What are your most painful processes? Your easiest?’ They may be different from mine, and by understanding that you can manage expectations.”

At Clemson “we don’t use formal satisfaction surveys,” Ellenberg reports. “I would tell you our conversations are so frequent, and our partners so comfortable with us, that we do not have to go through a formal process. In every piece of the engagement portfolio, we get constant feedback; that’s what strength in partnership depends on.”

Clemson works with its industry partners on a three- to five-year strategic planning approach. “Our account managers talk with their counterparts every day, or every week,” he says. “The steering committee is a higher-up group, which may meet once a quarter or a couple of times a year to review the partnership. Then, once a year we go back to our partners with an annual report. When we send the annual report, we follow up with a full debrief with the company, which allows us to talk strategy for the next year.”

People aren’t numbers

Despite their value, say observers, “hard” metrics have their limitations. “In the end, the most successful partnership depends on people — and people are hard to quantify,” said Garces. “Even with things as simple as looking at how many minutes, hours, or days, partners can either take the greenlight and be extremely engaged, or not really want to bother with us.

“It’s very hard to measure ROI,” he continued, “but if you can actually have that level of engagement, come up with new ideas, or quickly exclude targets so as to not chase red herrings, you have all the best. If people respond to us, get on the calls, go face to face, [you should] always listen carefully to what appears to be their driving motivation.”

“Approach business as business — ROI,” adds Ellenberg. “I know they approach Clemson looking for ROI and we recognize that. Still, anyone can develop tools, but creating that model of interaction is critical.”

Or, as Garces states, “a lot of metrics are great, but whether you’ll be a success really depends on the people.”

Contact Cho at 919-843-3315 or joonhyung.cho@unc.edu; Ellenberg at 803-737-0690 or ellenbe@clemson.edu; Garces at 800-283-8088; Narayan at 408-927-2405 or narayanc@us.ibm.com; and O’Donnell at 858-229-5963 or odonnell@ucsd.edu. From the University-Industry Engagement Advisor newsletter, May 2021.

Posted May 13, 2021