Sophie Haines and Ana Lopez of the Oxford Martin Programme on Resource Stewardship look at the challenges and opportunities that arise when researchers from different disciplines come together for a common goal.
As researchers working in ‘interdisciplinary’ environments, we find it important to reflect on what this means. What practices can help us navigate complicated research pathways? What are the challenges and opportunities?
The Oxford Martin School brings researchers from different disciplines together to tackle multidimensional questions. In our project, anthropologists and physicists have joined forces to explore the role of weather and climate predictions for natural resource decision-making. While producing knowledge beyond the limits of academic disciplines is not new, there seems to be a growing concern with bringing together physical and social sciences to address environmental challenges. Is the motivation to make ‘science’ answerable to ‘society’? To link up science, technology and markets? Or to address philosophical questions about the nature and status of different types of knowledge?[1]
We usually think of academic disciplines as bounded fields of study with distinct methods, concepts and subjectivities, which scholars develop through years of training. These boundaries can be crossed and questioned in different ways. ‘Multidisciplinary’ work often describes disciplines co-operating in parallel. ‘Transdisciplinary’ work is often seen as more radical: transforming or even abandoning established fields.[2] Our approach sits somewhere in between, in line with Rayner & Malone’s description of “people from different fields working on a problem that they have defined together in a way that it cannot be defined from within any single discipline”.[3]
Researchers from different disciplines often have distinct approaches to choosing research questions and methods. Anthropologists often ask open-ended questions, focusing on local details and subjective values. Physicists test particular hypotheses, seeking objective data and universal laws. In the early stages of the project, we proposed two parallel approaches. Searching for the ‘spherical cow’ in this context, the physicist thought: ‘if only the forecast were perfectly reliable, people would use it to guarantee rational decisions; any other barrier would be overcome’. Meanwhile, the anthropologist sought to highlight how complicated such problems are: ‘science is only one type of knowledge among many, and “rational choice” is not helpful for understanding how people make value-laden decisions in the real world’.
We then discussed our disciplinary assumptions, revisiting the meanings of ‘perfectly rational decisions’, ‘reliability’, ‘evaluation’, ‘verification’, ‘decisions’, even ‘forecasts’. We found common interests in exploring how different people and organisations judge the success of a forecast or decision, and thought-provoking parallels between our project’s focus on the ‘success’ of scientific predictions and policy decisions, and our need to reflect on the meanings of ‘success’ of our own work.
This has involved stepping out of our comfort zones and into unfamiliar settings. A physicist has conducted semi-structured interviews. An anthropologist has studied ensemble forecasts and presented to audiences of meteorologists. These things take time, patience, respect, trust, and a good sense of humour. It can be tempting for us to speak on behalf of ‘physical science’ or ‘social science’, but we are individuals, and our disciplines are internally diverse. Jointly-produced documents like glossaries and ethics committee submissions have helped us pin down what we are doing. This is an ongoing process. At a project meeting, a physicist showed a slide containing equations. A social scientist asked for a summary ‘in plain English’. ‘Plain English’ can be challenging for everyone: words as well as equations need care and explanation.
While a growing number of institutions and funders – like the OMS – support and encourage interdisciplinary projects, this is not universal. As Tom McLeish writes, there have been changes in how the UK Research Excellence Framework recognises interdisciplinary work, but many questions remain. Ismael Rafols highlights the issue of journal rankings, for example. Different disciplines have different publication expectations and timescales. It is challenging to write for general rather than specialist audiences, or for specialists in different fields. We are aiming to understand an issue holistically, but why would a model developer or forecaster care about a decision maker’s aversion to changing the ‘business as usual’ approach when doing so might put her job at risk? There are implications for the futures of projects, and of individual researchers (particularly those at earlier career stages), so these challenges deserve careful attention from the outset.
We thank our colleagues and participants in the OMPORS Usability of Forecasts project for ongoing conversations that have contributed to this piece.
[1] Barry, A., G. Born & G. Weszkalnys (2008) Logics of Interdisciplinarity. Economy and Society 37(1): 20-49.
[2] Nowotny, H., P. Scott & M. Gibbons (2001) Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty. Cambridge: Polity.
[3] Rayner, S. & E.L. Malone (eds) (1998) Human Choice and Climate Change, vol. 4: What Have We Learned? Columbus: Battelle Press. p. 51.
This opinion piece reflects the views of the authors, and does not necessarily reflect the position of the Oxford Martin School or the University of Oxford. Any errors or omissions are those of the authors.