When Banting and Best discovered insulin, a proud moment in our history, we had no hesitation in placing them on a pedestal. Then they won the Nobel Prize and our opinion was confirmed!
Some years later, in 1972, Saran Narang, in a National Research Council lab, perfected a way to synthesize genes, which led to the creation of synthetic insulin, saving many lives and contributing to the prosperity of pharmaceutical companies. Was this discovery basic or applied science? What was the difference? Both projects involved experimentation. Both were discoveries. Both had applications.
In his lab at Dalhousie University in Halifax, Jeff Dahn is working on creating a lighter, less expensive and longer-lasting battery, and is using the fundamentals of chemistry to do so.
Nuclear physicists in Ottawa are creating better ways to image tumours. Theoretical physicists who seek the origin of the universe are aware of the applications of their work to quantum mechanics and the capturing of energy.
Researchers in a 3D printing lab at Seneca College are producing models of organs so doctors can practise prior to operating. At the same time, they are discovering better techniques and materials, and opening the door to new possible applications.
What is the difference between basic and applied science? One may perhaps be more directed than the other. The discovery may be the byproduct – even accidental – or it might be the main purpose of the experimentation. Both require labs, equipment, thoughtful design and experimentation. Neither knows the results in advance.
Perhaps the main difference is the timeline. For what is generally considered to be applied research, there is the expectation that the work will generate results in a measurable and shorter time. There is a greater probability of success for both basic science and applied science when the problem is clearly defined and the parameters of the research are limited. The scientist knows one of the metals being tested will result in a lighter aircraft. On the other hand, the geneticist knows the same thing – it just might take 20 years to find the right gene. (When this process is shortened by more powerful and sophisticated computers, will that change the definition of the research?)
Do researchers doing basic science have no idea what they are about to research? That’s unlikely as scientists need to define their work before stepping into the lab, even if discoveries are sometimes accidental. Heparin, a blood thinner, was discovered in 1916 by Jay McLean and William Henry Howell, a student and professor looking for a clotting agent. Heparin was put on the shelf for a long time until someone was searching for a blood thinner.
Who pays for the experiment is another possible difference. At the Gairdner Awards in 2017, Huda Zoghbi, director of the Jan and Dan Duncan Neurological Research Institute at the Texas Children’s Hospital and investigator at the Howard Hughes Medical Institute, thanked those who had patiently invested in her research. She went on to say that it took her 13 years to achieve the results that will make possible the understanding of a disease and possibly lead to a treatment and cure. Will those who figure out how to turn Dr. Zoghbi’s discovery into a cure be any less scientists and will their results be any less important?
Research matters. Investment in research will lead the way to important and significant social and economic developments. Whether we call it basic or applied, what matters is excellence and originality. Let us not worry about terminology. Red herrings make interesting debates, but they do not advance our shared purpose.
Roseann O’Reilly Runte is president and CEO of the Canada Foundation for Innovation.
I would take this argument one step further. There is a dysfunctional dichotomy between the emphasis on theoretical understanding and the emphasis on practical application, which I do not believe is a red herring. Within the academy, a sound theoretical foundation for research is the norm, but outside it, where research findings are applied to practical problems, there is little to no concern about whose theory was most accurate.
The divide between theory and practice seems to have widened over the past few decades, deepening the town-and-gown divide as well. I often wonder: if we are not conducting research to contribute to solving real-world problems, why are we doing it? If the answer is to satisfy curiosity, let's at least be honest about it. But let's also be certain that the practical applications of research are respected and valued.
Why is this an issue for me? I'm job hunting (social sciences/planning, with a focus on sustainability, climate change adaptation, resilience and change management), and I occasionally encounter positions that seem attractive but turn out to be so theory-focused that there is little or no room for action research or for promoting practical applications. My focus, by contrast, is on delivering theoretically sound practical solutions: the role of a "pracademic."
Somehow, a balance is needed, but what that balance should be undoubtedly requires discussion and updating for current conditions. I think this discussion needs to involve people beyond the academy, unless the decision from within is to increase insularity, which I believe would be a serious mistake.