
U.S. National Committee for CODATA
National Research Council
Improving the Data Policy Framework
 


13

University Technology Transfer Practices:
Reconciling the Academic and Commercial Interests in Data Access and Use

Lita Nelsen




     The tragedy of the commons versus the tragedy of the anticommons that Steve Maurer addressed is part of what is guiding this whole discussion.1 Although I am very sympathetic to students' needs, coming as I do from a university, I think part of the trouble with turning data into information is that it takes a lot of money and a lot of motivation. Maybe we want data to be free but information to be profitable, because that is the only way we are going to get the incentive to keep developing the best analysis of information, which is what makes a database useful. I think this is part of what we are wrestling with.

     Let me give you what I see as the players in the university-industry interactions. The first player is the university, whose primary mission is the discovery of knowledge, the dissemination of knowledge, and education. The second player in this current debate is industry, which sees itself as primarily a profit-making entity, with the university providing useful knowledge and services (we will come back to that), but which is primarily proprietary in outlook. Our third player is the individual researcher, who is very different in many ways from an employee of a government agency, or even an employee of all but the most academic of nonprofits. A person in today's academic research world is primarily a member of a profession, not of an institution. The reward systems come from individual achievement. I think that one reason information gets sequestered is the need for the individual researcher to compete in the peer review system, to be first. Nobody wins the Nobel Prize by being second. When we see this problem of lack of information dissemination in the not-for-profit world, the motive is less often money and more often fame, standing, status, peer recognition, or just plain number of grants received. This is partial background for the following discussion.

     There are two issues I want to talk about. The first is the university's and the individual researcher's need for access to data; on this issue, industry tends to be on the other side of the table. The second is a different matter, technology transfer and research collaboration, where individual companies and universities are on the same side of the table. The other players--the public, other industry, and the government--are standing and watching that collaboration, not necessarily on the other side of the table.

     So let us take these two issues one at a time. First, access to databases: universities and, most important, individual researchers do not want to have to replicate anyone else's findings. They believe that knowledge should be free because they want to use it, and they do not have money to pay for it. The biggest concern for the university community in the proposed database legislation is the issue of fair use. Fair use is fairly well defined in copyright law, but the new database legislation is far less clear. The issue for the researcher is, How much of the data in a database can I use without legal infringement--if any? Take as an example a table of melting points of metals that someone has compiled. I want to take those data and make a graph of the melting points versus the reflectivity of the metals I have been studying. Will I be infringing a property right in a database? Do I instead have to go back to the individual books and articles from which each of the melting points was obtained? What a terrible waste of time and of society's research resources this "make work" would be. This is why the university community is fighting very hard to make sure that there is sufficient free access and fair use, however fair use comes to be defined.
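     To make the fair-use example concrete, here is a minimal sketch of the reuse at stake: taking a handful of values from someone else's compiled table of melting points and plotting them against one's own reflectivity measurements. All values and element choices below are hypothetical, invented purely for illustration; they do not come from any actual compilation.

```python
import matplotlib.pyplot as plt

# A few entries copied from a (hypothetical) compiled table of melting points, in kelvins.
melting_points_k = {"Al": 933, "Cu": 1358, "Fe": 1811, "W": 3695}

# The researcher's own (hypothetical) reflectivity measurements for the same metals.
reflectivity = {"Al": 0.92, "Cu": 0.90, "Fe": 0.65, "W": 0.62}

metals = sorted(melting_points_k)
x = [melting_points_k[m] for m in metals]
y = [reflectivity[m] for m in metals]

plt.scatter(x, y)
for metal, xi, yi in zip(metals, x, y):
    plt.annotate(metal, (xi, yi))  # label each point with its element symbol
plt.xlabel("Melting point (K)")
plt.ylabel("Reflectivity")
plt.title("Reflectivity vs. melting point (illustrative data only)")
plt.show()
```

     The legal question the speaker poses is whether even this trivial reuse of four compiled values would infringe a database right, forcing the researcher back to the primary literature for each number.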

     There is one mitigating factor that can make database protection societally reasonable, and that is time. If the period during which data are kept confidential is short enough, I think the world can live with it. The problem with copyright, even though copyright and database protection look similar, is that copyright lasts almost forever--95 years in some cases, or 70 years after the author's death in others. This is too long for information to be tied up. At the same time, I think we have to provide some kind of incentive for people to turn their data and their collections into something useful. No one is going to do the work to turn raw data into useful information (databases and compiled data) if there isn't a possibility of substantial profit at the other end.

     Then there is the issue of the people who compile databases within the university and whether they will see this as a source of personal profit. Individual databases most likely will not be profitable: usually you cannot compile a database on your own time (too many resources are required), so some sort of grant is required, and the university will own the results. Most universities are committed to publication, so there is a reasonable presumption that the data will not be sequestered--by policy. Also, federal grant sponsors require the data to be published.

     Let us switch to the other subject that everybody worries about: if universities do business with industry, will universities stop publishing? Now we are getting out of databases and simply into data and other information. The interaction of universities with industry goes back a long way. Historically, one of the original purposes of the land grant universities was to aid commerce (in those days, mostly agriculture) through the research and expertise in the universities. (Incidentally, did you know that the Massachusetts Institute of Technology (MIT) is a land grant university? Surely one of the most atypical ones in the country. It was chartered in 1861 to "bring science and the useful arts to industry.") There is also a long history of technical universities helping industry, certainly in the late nineteenth and early twentieth centuries. Consulting, of course, has also gone on for a long time and has generally been thought of as a positive force in universities because it brought the real world into the university laboratory.

     A lot of university technology transfer offices started after 1980, with the passage of the Bayh-Dole Act. It was this act that gave universities the title to inventions coming out of federally funded research, and as most of you know, approximately 90 percent of research in universities is funded by the federal government. One major purpose of the act was to use America's head start in discovery research to foster technological innovation and economic development; but how could this be made to happen? The key was to use patents as a way of avoiding the classic tragedy of the commons--if everyone owns something, no one invests in it.

     The problem with university inventions and research discoveries is that they are not products; they are early-stage "development opportunities"--at best. Neither their commercial technical feasibility nor their market is proven. Companies are reluctant to invest in them because of the very high risk--and the additional risk that, if the development is successful, competitors will come in after the first company has "shown the way." However, if the invention is patented, the universities can say to industry: "If one of you is willing to take the risk, and is qualified, we will grant you patent protection, such that if you succeed, your competition will be shut out for a certain period of exclusivity, and you can reap the rewards of your investment and risk taking." Thus, the primary purpose of university licensing (and of the Bayh-Dole Act) is to get companies to invest in turning laboratory findings into real products, companies, and jobs. Universities hope to make a little bit of money out of it if they succeed, through royalties. It has taken universities a good decade to really learn how to work with intellectual property, to market it, and to make agreements. I think the tragedy of the anticommons that Rebecca Eisenberg2 talks about is more a matter of a learning curve that is taking longer than people thought. There have been some unwise licensing strategies, but if you look at the actual statistics, the number has been surprisingly small, and the universities are learning quickly. But the financial story has been somewhat surprising (at least to those with high expectations).

     Universities are now doing 3,000 agreements a year. That learning curve and the interest in intellectual property accelerated with the fall of the Berlin Wall. Why? Because universities got scared out of their minds that, without the fear of the Soviet Union, the American public would no longer support fundamental research, and there was an expectation of a decline in federal funding of research. So the universities thought: if defense isn't going to fund us anymore, we had better turn to industry. A few universities believed they would make lots of money from licensing, but none of them did. The average university, even one with a very robust technology licensing program--and including MIT--earns maybe 1 or 2 percent of its research budget from licensing.

     Since licensing wasn't going to provide the money, universities turned to industry to pay for research directly. This became something the universities all started to work on: how do we get industry to fund our research? They didn't look very hard at the numbers, because if you actually looked at the basic research budget of industry, even 10 percent of it going to universities would not make up for a 15 percent decline in the federal budget for university research. Tapping industry's research budget wasn't going to work either, but we did see acceleration. At MIT we have gone from 6 percent industrial funding to about 20 percent, and our federal funding has not dropped. (A third motivation pushed research hospitals toward collaboration with industry: with changes in funding from drug research groups and health maintenance organizations, the profit that the hospitals and research centers had formerly made on the clinical work that supported their scientific research dried up almost completely, so the research hospitals, even more than the nonmedical universities, were suffering and looking to industry.) At the same time, the late 1980s and early 1990s brought a severe downsizing (or "right sizing," or whatever other euphemism or epithet you prefer) to industry, in which basic research laboratories were closed down. By about 1990, after companies had eliminated much of their basic research through downsizing, they found themselves (although not for this reason) in pretty good economic shape--but without new products in the pipeline (not surprising, since they had not invested in research for many years). They did not want to rebuild their basic research capabilities, so they looked to universities for new products, or at least new product opportunities and intellectual property.
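     To see why tapping industry's research budget could not fill the gap, consider a back-of-the-envelope calculation. The speaker gives only the percentages; the dollar figures below are invented round numbers for illustration, not figures from the talk.

```python
# Hypothetical round numbers (illustration only; not from the talk).
federal_university_research = 15e9    # annual federal funding of university research, $
industry_basic_research = 10e9        # industry's total annual basic-research budget, $

shortfall = 0.15 * federal_university_research    # a 15 percent federal decline
industry_share = 0.10 * industry_basic_research   # 10 percent of industry's budget

print(f"Federal shortfall: ${shortfall / 1e9:.2f} billion")       # $2.25 billion
print(f"Industry's 10%:    ${industry_share / 1e9:.2f} billion")  # $1.00 billion
# Unless industry's basic-research budget dwarfed federal support for
# university research, 10 percent of it could not offset a 15 percent decline.
```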

     So, these two historical trends came together (the universities needing money because of a downsizing of federal funding, and the companies needing new technology because of a downsizing of their own research labs), and the two sides really wanted to dance together. In the past, there had been some industrial support of research universities that was largely quasi-philanthropic--it made industry look good, and there wasn't much of a problem of conflicting policies. However, as the two sides really began to need each other, the clash of cultures became more evident, with access to data right in the middle of the clash. The university sees itself as primarily engaged in discovery, education, and dissemination of knowledge; from a research point of view the emphasis is on discovery and dissemination. Corporate America, on the other hand, is inherently proprietary, with an emphasis on stockholder value and a consciousness of competition. Industry is a highly competitive place; companies have to keep things from each other in order to survive. There are a number of bases of competition: sometimes it is being better, cheaper, or faster--and sustaining that advantage. Increasingly, however, intellectual property in its various forms--patents, trade secrets, or information--has become important as a competitive force.

     I have been talking about this for the last 7 or 8 years, but now we are in the information age, in which data, as a basis of competition, are becoming more important than ever. The reason is partly that discoveries are happening so fast and partly that companies are looking for head starts. There is the need to be first in new technology; six months is supposed to make a big difference in the pharmaceutical industry. So when companies that have never sponsored academic research before come to universities (which need industry money more and more), suddenly negotiating the basic contract becomes hard. Industry starts with, "We are paying for this, and we want you to keep these data confidential. We will own the patents, and your investigators will do the following tasks, because that is what our contracts look like when we hire consulting companies or other research companies." The university believes that this is absolutely antithetical to what it does. We believe that investigator-initiated curiosity and insight should determine what is worked on next and that free dissemination of information is critical to our mission. Universities also want to own the patents, since they owned them when the government paid for the research.

     At this point industry says, "What are we paying for?"

     Over the last 5 years, the two sides have been learning to find compromises that preserve value and principles on both sides. For the major universities, the ones who can get away with it--the elite, if you like--it goes as follows. Universities still believe in investigator-initiated research and say to industry, "If you want tasks for hire, please go somewhere else. There are lots of good places to go, but not here." We still believe that all our information--and I didn't say necessarily raw data, but everything that adds to the body of knowledge of science and allows our investigators to be peer reviewed--is sacrosanct and must be publishable. We will not keep data confidential, and we have to own the patents (because we have to make sure that the technology is properly developed, but more important, so that our investigators, who have a right to royalties under federal grants, are treated the same way under industrial grants).

     What we have been able to show industry is that we can still make this work, and we want it to work. We make it work by filing patents to protect the intellectual property. We aim to file before we publish, and we file very quickly so that our researchers can publish as quickly as they want to. Industry benefits by working with us and getting the state of the art--a head start on knowledge, beyond what is eventually written in the journal articles--and it gets licenses to the patents for commercial development. However, data will not be kept private.

     We have been able to explain to industry: "You don't want us to be like you. If we were like you, why would you be hiring us? It is precisely the ability of the universities to attract the best and the brightest, to take advantage of their curiosity, and to have them measured against world standards rather than proprietary work, that lets us do for you what you want us to do. And we give you early access (by allowing you to hang around as the research is done) and options to exclusive licenses, which will give you the competitive protection that you need." This paradigm has been working pretty well for us and for our industrial partners.

     However, it is harder for the less elite universities to hold the line against industry demands for more proprietary rights--such as confidential data. These universities are often more in need of funds and feel themselves in direct competition with other universities that might be willing to "give away more to get the business." I therefore believe that it is the obligation of the more elite universities to hold the line on principles (such as full freedom to publish, investigator-initiated research, and ownership of patents), both for their own sakes and so that the "core principles of academia" can be preserved for their less powerful brethren.

     At MIT, we receive almost 20 percent of our research support from industry, with some very major collaborations, but we are still able to hold these lines firmly. I think our success, and that of other universities that have managed to stick to their core principles in the face of economic forces, comes from remembering who we are and why we are in the game. We don't want to play just for the sake of growing; we want only to do what we do better and more widely, in the public interest.

     So far, so good, but I think there are new challenges coming. One is the speed with which information, and probably publication, will move, so that there will be very little time between when a manuscript is written in its first draft form and when it is out on the Web.

     The second challenge arises when information, rather than patents, becomes the basis of industrial competition. Who owns the information, the university or the individual researcher? Most university policies state that data collected under a grant to the university belong to the university, not the researcher. However, this is rarely enforced, and when the data are processed into "information," who owns that? As mentioned earlier in this conference, most university policies are decades old, and policies on copyright are based on a model of textbooks and journal articles. What happens, then, when a professor is teaching a physics course over the Web? Who owns the copyright? The policies currently in place at most universities no longer work, and I don't know what the solution will be in this coming clash between the rights of the individual faculty member and the rights of the university in the newly valuable "content" arena. Nobody that I know of in the university community has decent policies on ownership of content by faculty.

     The third challenge appears as new forms of intellectual property protection come in, such as database protection. Our sponsor is now saying, "Hey, you are willing to give us exclusive licenses to patents, and we have agreed that we will pay for the exclusivity, and that's okay. How about your databases? We need exclusivity there now." Universities are going to have to say, "Gee, we never thought about that. We don't know. We understand an exclusive license to a patent: it provides some incentive for you to develop the technology, we like the idea, we put in various conditions, and we can do that. But databases? That's information. We don't know if we can grant exclusivity to that." You asked me about the debate on this in the university community; I had never even considered the issue until today. So we don't know where that one is going to go.

     Data sharing looks like the same problem, but it isn't. It doesn't usually apply to the relationship with industry; it relates, once again, to the individual. Think about the Human Genome Project, for example. The Genome Project basically says, "We are going to pay the universities to do mass production and require that they put all information out on the database within 24 hours." This policy is just fine for the principal investigators and other researchers with fame, tenure, and everything they need. However, for the individual postdoc or junior professor who would like to sequester the data for a while, do some real research (rather than just the boring crunching), and actually discover something (rather than just put the data out on the Web and let somebody with more resources do the interesting work), this requirement for "instant sharing" gets very much in the way of the academic process. Again, I don't know where society's best interests lie in this. In what we assume to be the totally virtuous aim of making all information freely and instantly available, we should remember that proprietary interests can sometimes be beneficial in that they can provide societally useful incentives, not just in the commercial sphere but in the academic one as well. So should all data and information be disseminated as soon as they come into existence, or might there be a reasonable middle ground? Do we have another framework for thinking about this? How will we decide?

     The only really interesting questions in life are the hard ones--those with no simple black or white answers. So we should feel fortunate: we have lots of interesting questions to work on in this complex field of data sharing, data access, and use.

      



Notes

1 See Chapter 11, "Intellectual Property Law and Policy Issues in Interdisciplinary and Intersectoral Data Applications," of these Proceedings for additional information.

2 M.A. Heller and R.S. Eisenberg. 1998. "Can Patents Deter Innovation? The Anticommons in Biomedical Research," Science 280:698-701.



Copyright 2001 the National Academy of Sciences
