Idea tree: A tool for brainstorming ideas in cross-disciplinary teams

Community member post by Dan Stokols, Maritza Salazar, Gary M. Olson, and Judith S. Olson

How can cross-disciplinary research teams increase their capacity for generating and integrating novel research ideas and conceptual frameworks?

A key challenge faced by research teams is harnessing the intellectual synergy that can occur when individuals from different disciplines join together to create novel ideas and conceptual frameworks. Studies of creativity suggest that atypical (and often serendipitous) combinations of dissimilar perspectives can spur novel insights and advances in knowledge. Yet, many cross-disciplinary teams fail to achieve intellectual synergy because they allot insufficient effort to generating new ideas. Here we describe a brainstorming tool that can be used to generate new ideas in cross-disciplinary teams.

The idea tree exercise

This exercise is straightforward and requires few resources other than pens or pencils, blank sheets of paper, and a table at which eight to ten team members representing two or more disciplines are seated, as shown in the image near the bottom of this post. At the start of the exercise, each participant is given a blank piece of paper and asked to work independently, writing their initial ideas at the top of the page. Depending on the specific group task, the ideas can relate to a research question or hypothesis, a new concept or method, or an outline of a proposed study. The scope of the brainstorming task can be left relatively open-ended, or focused more narrowly on particular topics relevant to the cross-disciplinary team (e.g., neuroscience, climate change, health disparities research). During the three to four minutes allotted for this part of the exercise, individuals are encouraged to avoid being too self-critical of their own entries, even when their ideas seem preliminary or provocative.

Once all participants have written down their ideas, they each pass their page to the person sitting next to them. Participants are asked to adopt a supportive and inclusive stance toward the ideas they have now received, taking three to four minutes to write a brief reaction. For example, they can elaborate on the original idea or pose a question about it. Each sheet of ideas is then passed on again, making its way to each participant around the table and gathering additional entries extending the thread of ideas triggered by the initial prompt.

After the pages have been reviewed and annotated by all participants, they are returned to the individuals who wrote the initial entry. Thus, every member of the team receives a page containing several elaborations of his or her initial idea from the respective vantage points of other participants.

Each completed page, in effect, reflects a branch of the overall idea tree created by the group as a whole. The idea tree tool is designed to harvest several new ideas in a relatively short period of time and facilitate serendipitous combinations of disparate views among members of cross-disciplinary teams.

With eight to ten participants, a complete round of this brainstorming exercise can be done in about 45 minutes to an hour.
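For facilitators planning a session, the timing follows directly from the rotation: each page collects one entry from every participant, so total writing time scales with group size. The minimal sketch below is purely illustrative and not part of the original exercise description; the participant count, minutes per entry, and setup-and-discussion buffer are assumptions you can adjust to your own group.

# Rough planning sketch for one idea tree round (illustrative assumptions only).
def estimate_round_minutes(participants, minutes_per_entry=3.5, setup_and_discussion=15):
    # Each page circulates the full table, so every participant writes one
    # entry per page; writing time is roughly participants * minutes_per_entry.
    return participants * minutes_per_entry + setup_and_discussion

# Example: ten participants at about 3.5 minutes per entry, plus a 15-minute
# buffer for instructions and wrap-up discussion, gives roughly 50 minutes,
# consistent with the 45-to-60-minute estimate above.
print(estimate_round_minutes(10))  # 50.0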

General insights gleaned from the idea tree exercise

Besides generating several new research ideas among team members, the exercise elicits more general insights about brainstorming processes in team science. After participants have finished commenting on the ideas offered by other team members, they are invited to discuss the results of their collaborative brainstorming, which commonly reveals the following insights:

  • conceptual synergy is unleashed by connecting disparate (and sometimes competing) viewpoints as a basis for discovering new ideas
  • a relatively large number of ideas can be generated when participants adopt a supportive, inclusive stance toward each other’s entries and limit themselves to only a few minutes for writing down each of their own thoughts and reactions to others’ ideas
  • deadlines and time pressure can assist discovery of new ideas
  • using diagrams and drawings to formulate and convey one’s ideas has generative value.

Conclusion

A crucial goal of cross-disciplinary research teams is the creation of new ideas and conceptual frameworks that advance knowledge within and across fields. The idea tree has proven useful as a brainstorming tool in cross-disciplinary training and research settings. The exercise is especially helpful in prompting novel links between disparate ideas, although the processes of prioritizing, refining and integrating insights derived from the idea tree require longer-term collaborative discussion and metacognition, as described in Machiel Keestra’s recent blog post. Not all of the ideas gathered through the exercise will be deemed sufficiently novel and useful to warrant further development. However, the more time and effort research teams allocate to knowledge creation and integration activities, the better their prospects for achieving cross-disciplinary insights that trigger scientific and societal advances. The idea tree can be used by research teams at repeated intervals to build their capacity for knowledge discovery and integration. It is one of the methods used in our research and training initiatives at the University of California, Irvine’s Team Science Acceleration Lab, described in our previous blog post on strengthening the ecosystem for effective team science.

Have you used alternative brainstorming tools in cross-disciplinary research teams? We welcome your thoughts about other idea generation tools and your reactions to the idea tree exercise if you have occasion to use it in an educational or research context.

Idea tree brainstorming group (photograph by Dan Stokols)

Some additional resources for collaborative brainstorming:
Adams, J. L. (2001). Conceptual blockbusting: A guide to better ideas. 4th edn, Basic Books: Cambridge, Massachusetts, United States of America

Gordon, W. J. J. (1974). Some source material in discovery-by-analogy. Journal of Creative Behavior, 8: 239-257

McKim, R. H. (1980). Thinking visually: A strategy manual for problem solving. Wadsworth: Belmont, California, United States of America

Mills, C. W. (1959). The sociological imagination (Appendix: On intellectual craftsmanship, pp. 195-226). Oxford University Press: New York, United States of America

Uzzi, B., Mukherjee, S., Stringer, M., and Jones, B. (2013). Atypical combinations and scientific impact. Science, 342, 6157: 468-472

Wicker, A. W. (1985). Getting out of our conceptual ruts. American Psychologist, 40: 1094-1103

Acknowledgement:
We thank the Office of Research and the Center for the Neurobiology of Learning and Memory at the University of California, Irvine, USA for their support of this work.

Biography: Dan Stokols is Chancellor’s Professor Emeritus at the University of California, Irvine, USA and served as founding Dean of the university’s School of Social Ecology. His research spans the fields of social ecology, environmental and ecological psychology, public health, and transdisciplinary team science. He is author of Social Ecology in the Digital Age and co-author of Enhancing the Effectiveness of Team Science.

Biography: Maritza Salazar is an assistant professor at the Paul Merage School of Business at the University of California, Irvine, USA. Her research focuses on learning and innovation in teams and organizations, especially enhancing the competitiveness of firms, the effectiveness of teams, and the quality of the work experience for individuals. She serves as President of the International Network for the Science of Team Science (INSciTS).

Biography: Gary M. Olson is Professor Emeritus and formerly Donald Bren Professor of Information and Computer Sciences at the University of California, Irvine, USA. The focus of his work has been on how to support small groups of people working on difficult intellectual tasks, particularly when the members of the group are geographically distributed. He co-edited (with Ann Zimmerman and Nathan Bos) Scientific Collaboration on the Internet.

Biography: Judith S. Olson is the Donald Bren Professor of Information and Computer Sciences Emerita in the Department of Informatics at the University of California, Irvine, USA. For over 20 years, she has researched teams whose members are not collocated. She co-authored (with Gary Olson) Working Together Apart: Collaboration over the Internet.

 
