Let’s not pretend that nine conversations is anything like a quantitative survey. But as someone with a recent obsession with academic-industry relations and what works best, I was fortunate enough to have introductions to a variety of industry-side people at the autumn UIDP meeting. I wanted to ask them what works in choosing and executing academic collaborations. Top-down? Bottom-up? Measure everything? Measure nothing?
Below is a digest of their approaches and their comments. It represents only a very small number of industries: only those who were at the UIDP meeting (and therefore highly interested in university collaborations) and only those people who were kind enough to speak with me. A broader and more professional survey would likely yield more authoritative results. But the responses would probably be just as diverse, and would underline the fact that every approach can work. The common thread seems to be the recognition that industry partners must be prepared to put in staff time and effort toward the strategic development, project execution and high-touch relationship-building necessary for successful university co-creation.
The industries covered included pharmaceutical (2), chemical (2), medical devices, technology, automotive, consumer goods and food/beverage.
The companies varied in their strategic approach to research collaborations. A substantial group (5) defined their collaboration strategy in a top-down manner, deciding centrally ahead of time what they would focus on, and seeking out partners in a systematic way. But there were others (3, several with very mature collaboration programs) who said their partnership strategy was defined at the business unit or principal investigator level, and was not centrally driven. One company said that they had a mixed central-driven and business-unit-driven approach.
Once the strategy is set, who decides which projects and partners to approach? Project-level decision making was similarly diverse. The companies with a central team driving collaboration strategy also tended to choose the projects to support centrally. Those that let their business units or internal principal investigators define the collaboration strategy also let those decentralized units decide which projects to execute.
One of the companies that preferred centralized strategy decision-making said: “We do the project triage up front, and we assess the value impact to the company: if it’s a good fit as a university, the risk profile, the likelihood of success and if this project is better done internally than externally.”
There was much more variety in which team originated ideas for university-industry collaboration projects. Of the nine companies, one generated project ideas centrally (with some coming externally), three took ideas from the business units, one looked exclusively at external project proposals (and reviewed them centrally) and one used both business unit and external ideas. One company, which also had the mixed centralized and business unit approach to strategy, collected project ideas from everywhere: centrally, from the business units and also from external partners.
One of the companies that prefers the business-unit approach noted: “Being bottom-up and global means we gain balance. If we are going to fail in what we’re doing at a university in a location, it will fail because of the research question. And that has to be developed by our [corporate] researchers.”
Another company’s collaboration officer noted that while they were bottom-up, they would prefer a move to a more strategic approach, saying, “We’d like to know more about an institution’s core capabilities, licensing history and willingness to engage in pre-competitive and non-competitive spaces. And it would be helpful to know when universities already have existing partnerships, particularly when they’re with a competitor.”
There did not appear to be any relationship between a company’s choice of a top-down or a bottom-up approach and its sector, its age, or the maturity of its collaboration program.
Types of Partnerships
Most of the companies (7) had both a strategic partner approach and an ad hoc approach to research collaborations. Two companies chose to work ad hoc entirely, with no strategic partners, and one company used centers of excellence, strategic partnerships and ad hoc partnerships. All of those interviewed said that they did not choose partnerships by their location (i.e. a university close to them), but two thirds of the companies (6) noted that their companies’ historic ties to a region continued to influence their partnership choices. This was not related to the location (US or rest-of-world) of their headquarters, the age of the company or the maturity of their academic partnership program.
A research officer in a company that prefers to do both strategic partnerships and ad hoc projects (all by application and central review) commented: “In looking for partners, we don’t want to put our finger on the scale. We should be doing projects together because of mutual interest and merit. So we don’t go for centers of excellence, and we don’t give any special treatment.”
“The other way we do projects is via signed master agreements with our Alliance Partners. These groups still have to apply to us for funding with their project proposals, but if we do fund them, we have these agreements in place. The Alliance Partner program has its own budget and only Alliance Partners can apply. They can propose their ideas, but we don’t guarantee they’ll be funded.”
Another noted the influence of contracting terms: “Most partners are in the US, but contracting conditions do make a difference. The first priority is merit. But cost and IP are also considerations. If all things were equal, we’d go with the partner with the better terms.”
Types of Collaboration
All nine of the companies currently run research collaborations, in which their teams co-create with academic researchers. That’s not surprising, since they’re all active members of UIDP. And they all felt that co-creation with their own researchers was an important part of their collaboration portfolio.
One research officer noted the amount of time necessary to develop co-working relationships: “We have built up this partnership over the past 8 years – we started small and low key and then built up to where we are today – it takes time!”
The choice to engage in different types of research support was more mixed. Six companies currently collaborate via contract research (one noted that it has only a few such contracts), and three do not. Three companies choose to give grants (all do so based on an application process), and four do not currently collaborate via grants. A similar proportion of companies offer internships, though the mix of companies differed from those working with grants. One of the nine interviewees didn’t comment specifically on types of collaborations beyond research collaborations.
One participant noted the apparent novelty of close work with industry partners: “Academics and their students are not used to working with industry. It can be a surprise for them, discovering that scientists in industry and academia are similar.”
As with strategy and partner choices, there was a mixed approach to project execution. Unsurprisingly, companies that have a strongly centralized strategy also tend to fund projects centrally. Those that allow the business units to choose their academic partnerships also tend to let the business unit or internal principal investigator fund them. Regardless of approach, many mentioned the importance of stability. “A typical tension is industry trying to be nimble and trying to fail faster,” one said. “Academics need durability and predictability for their students. That kind of thing gets negotiated per project. We are not willing to stop funding something and leave a student hanging because of a shift of focus.”
Interestingly, the day-to-day ownership and execution of the projects was more mixed. Only two of the companies that centrally choose and fund projects also execute them centrally. One that based collaborations on bottom-up business needs chose to run the projects from a central team. The same company that was mixed in its strategy and project choice was mixed in its approach to project execution. It is a relatively young company with a necessarily younger partnership program.
The people I spoke with on the industry side had lots to say about how they engage with academics. “When meeting a prospective research partner for the first time, we like to present first,” said one of the participants. “We discuss (1) what research topics are of interest to us and (2) why we have a research grants program. We prefer to share our research grants needs and strategy immediately. We tend to be very up front regarding our decision-making process and expectations of research partners.”
Running the project and ensuring university-industry engagement can be handled by different teams. After all, they require different skill sets and approaches. Two companies said that the central partnership team was solely responsible for engagement. Three companies noted that engagement with partners was primarily the business unit’s responsibility and four noted that both teams were responsible. Nearly all of the companies specifically called out that they required very frequent academic engagement from their company teams. “We have a very hands-on approach,” one stressed. “We have young employees lead these projects, as it is a great career development point. They get papers and an external reputation. They also get experience in project management.”
Another had advice on inspiring better performance: “If a university researcher is underperforming, we’ll tell him/her. We may also invite him/her to a meeting with other researchers working on similar topics. This way, everyone sees the level at which the others are performing, and it stimulates a bit of academic competition.”
Two of the company representatives said that they have defined multiple metrics for their partnership programs. Unsurprisingly, these companies have long-established academic collaboration programs. Two others said that they had some metrics. The majority (5) noted that they rely mostly on a qualitative approach to assessing the success of their collaborations.
“It’s a narrative [to senior management] about the value of the project,” one said. “We ask, was it worth it? Did it move the business forward? What are we able to do that we were not able to do before? Will it ultimately save us time and cost?”
The two companies with multiple metrics used the same metrics across projects. Another kept a single metric (company researcher satisfaction) across all projects. Five other companies used different success metrics per project (often this included recruitment). One didn’t clarify if their success metrics changed per project.
Interviewees often emphasized the benefits of failure, saying things like: “We’ve learned from our surveys that projects that fail scientifically are still incredibly valuable. We might have gone in there with a hypothesis that failed and that informed an entire program and made us change what we were doing. That’s wonderful. Though the science didn’t work, the project is a success.”
There was a similar approach to continuous measurement. Four companies (all companies with long-standing collaboration programs) assessed their academic collaboration projects throughout the process. Four companies preferred to measure success only at the end of the project’s term and one company said that their approach varied (this was the company with the most mixed approach overall).
Do these companies ever end unsuccessful projects early? Some said yes. “Terminate? Yes, we could do it for many reasons. It could be that a site isn’t working out. Or a project could be written, but six months in, it’s not working and they can’t repeat results. Or an idea just failed. We don’t waste their time, and we pull out. Project-by-project, the internal and external teams should know quickly if it’s not working. If we are funding a PhD student we never terminate an agreement. Sometimes we leave projects running longer, because we’re interested in talent acquisition. But if a scientist wants a result, and it doesn’t work, we pull the plug.”
One of the most metrically-oriented companies qualified that they don’t expect success every time. “We have a structured approach to assess if projects were worth doing or not,” they said. “Publications are not meaningful to us. Most patents are useless and don’t go into production. What we’re looking at is money saved, time saved, quality improvement, and things like that. We don’t attempt to monetize all projects. We don’t do an ROI per project. Most get nothing and some are really successful. You have to swing for the fences every time, but you can’t count on a home run. It’s a portfolio approach, and we assess the success of the whole portfolio.”
It’s clear that there is no right way to structure academic collaborations. Top-down or bottom-up matters much less than strategic consideration of projects, communication between business units, and an active and deeply engaged co-development-style relationship.
“It varies by institution and professor, but we are very high touch,” one participant said. “And we have to expectation-manage that up front. Generally they really appreciate our involvement. But we make sure we go in and say, ‘We want to work with you; that’s our style. Is that ok with you?’.”