Good practices in system dynamics modelling

By Sondoss Elsawah and Serena Hamilton

Too often, lessons about modelling practice are left out of papers, including the ad hoc decisions, serendipities, and failures encountered during the modelling process. The lack of attention to these details can lead to misperceptions about how the modelling process unfolds.

We are part of a small team that examined five case studies where system dynamics was used to model socio-ecological systems. We had direct and intimate knowledge of the modelling process and outcomes in each case. Based on the lessons from the case studies as well as the collective experience of the team, we compiled the following set of good practices for system dynamics modelling of complex systems.

Good practices in the model scoping and conceptualization phase:

  • Account for the time and resources required for evaluation and iterations in the proposal and planning phase.
  • Take a step-wise approach, especially when identifying the conceptual model elements with stakeholders. This stakeholder engagement should occur over multiple sessions; building a model in one sitting can be overwhelming.
  • Remind stakeholders that modifying the hypotheses is relatively simple, so they can begin with an initial interpretation and change it easily. Also, the modeller should not exert too much pressure to include feedback loops, as some stakeholders may find them difficult or irrelevant.
  • Elicit knowledge about the formulation of decision rules as well as about the system at stake.
  • Be aware of the limitations related to the different methods used to elicit and visualize the dynamic hypothesis, and how they may affect the final model and its application/use.
  • Leverage the strengths of the various elicitation and mapping techniques throughout the modelling process, for example by pairing methods or developing detailed variant maps (see the sketch after this list).
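
To make the last two points concrete, here is a minimal sketch of one way an elicited dynamic hypothesis could be recorded as a signed digraph so it can be revised between stakeholder sessions and its feedback loops inspected. This is our illustration only, not drawn from the case studies: the variable names, polarities and the use of the networkx library are assumptions made for the example.

```python
# A minimal sketch (illustrative assumptions only) of recording an elicited
# dynamic hypothesis as a signed digraph, so it can be revised between
# stakeholder sessions and its feedback loops inspected.
import math
import networkx as nx

cld = nx.DiGraph()
# Each edge carries the elicited polarity: +1 (same direction) or -1 (opposite).
cld.add_edge("irrigated area", "water demand", polarity=+1)
cld.add_edge("water demand", "reservoir level", polarity=-1)
cld.add_edge("reservoir level", "water allocation", polarity=+1)
cld.add_edge("water allocation", "irrigated area", polarity=+1)

# List every feedback loop and classify it as reinforcing or balancing
# from the product of its edge polarities.
for loop in nx.simple_cycles(cld):
    signs = [cld[u][v]["polarity"] for u, v in zip(loop, loop[1:] + loop[:1])]
    kind = "reinforcing" if math.prod(signs) > 0 else "balancing"
    print(" -> ".join(loop), f"({kind})")
```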

Good practices in the model formulation phase:

  • Build a simple model first – identify the key variables, key decisions, main functions and primary behaviours. Once you have a first simple version of the model, consider the reference behaviours, i.e., does it behave as expected? More detail, including different metrics, can be built in iteratively (a minimal sketch follows this list).
  • Reflect on the model as it advances. At each model iteration, consider whether the model is aligned with its objective and scope.
  • Make use of (already tested) model structures (also known as model modules) when pertinent and available. This reuse may include model components built using other modelling approaches, forming hybrid system dynamics models; other approaches can sometimes simulate parts of the dynamic hypothesis better or more easily than system dynamics.
  • When your model consists of several modules, test these components individually first, then in pairs, triplets and so on, to manage model complexity and to allow a thorough investigation of each relationship between core processes.
  • Make smart use of prototypes to give users an appreciation of the final model's capability while avoiding the risk of inflating their expectations.
  • Pay attention to (spatial and temporal) scaling and reporting unit questions when considering the model’s objective.
  • Avoid hiding parameters in equations. Keeping them explicit makes the model more transparent (to stakeholders and users) and easier to update.
  • Make use of software development methodologies (e.g., the Vee development process) and practices (e.g., version control) to structure the way you develop and test the model.
  • Make careful use of arrays and subscripts, as they can be complex to develop and test and can hide some of the complexity of the model structure. Develop and test a full version of the single-dimension (non-arrayed) model before adding dimensions.
  • Calibrate the model using historic data where possible, even if the data are only available for part of the system.
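
To illustrate the "simple model first" and "explicit parameters" points above, here is a minimal, hypothetical sketch of a one-stock model with explicit named parameters and a quick reference-behaviour check. The reservoir setting, parameter names and values are assumptions made for illustration, not taken from the case studies.

```python
# A minimal sketch (illustrative only): one stock, two flows, explicit named
# parameters, and a quick reference-behaviour check.

# Explicit parameters: nothing hard-coded inside the equations.
PARAMS = {
    "inflow": 12.0,            # ML/day, assumed constant inflow
    "outflow_fraction": 0.05,  # fraction of storage released per day
    "initial_storage": 100.0,  # ML
}

def simulate(params, days=365, dt=1.0):
    """Euler integration of dStorage/dt = inflow - outflow_fraction * storage."""
    storage = params["initial_storage"]
    trajectory = [storage]
    for _ in range(int(days / dt)):
        inflow = params["inflow"]
        outflow = params["outflow_fraction"] * storage
        storage += dt * (inflow - outflow)
        trajectory.append(storage)
    return trajectory

traj = simulate(PARAMS)

# Reference-behaviour check: storage should rise smoothly towards the
# equilibrium inflow / outflow_fraction (goal-seeking behaviour), not oscillate.
equilibrium = PARAMS["inflow"] / PARAMS["outflow_fraction"]
assert all(b >= a for a, b in zip(traj, traj[1:])), "expected monotonic approach"
assert abs(traj[-1] - equilibrium) / equilibrium < 0.05, "expected near equilibrium"
print(f"final storage: {traj[-1]:.1f} ML (equilibrium {equilibrium:.0f} ML)")
```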

Good practices in model evaluation:

  • Make use of logical reasoning and expert judgement to assess the structural validity of each of the interactions, and the complete set of interactions.
  • Perform behavioural testing against data for the parts of the model where data are available. This should be complemented with other forms of model evaluation, including peer review, sensitivity analysis, uncertainty analysis, robustness checks and comparison with other models.
  • Whether using specific data to populate a function or inferring a reference behaviour, stress-test the model to ensure that it reproduces the system behaviour as closely as possible across the range of potential scenario or decision-variable settings (an illustrative sketch follows this list).
  • Test “on the go”, i.e., test small components before uncertainty grows “out of control”. Once all components are tested, an integrated, whole-of-system test is essential.
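
As an illustration of such stress-testing, the sketch below sweeps the toy reservoir model from the formulation section (reusing its simulate function and PARAMS) across assumed ranges of scenario and decision-variable settings and flags implausible behaviour. The ranges and plausibility checks are illustrative assumptions only, not taken from the case studies.

```python
# An illustrative stress-test (assumptions only): sweep the toy reservoir model
# across plausible ranges of scenario settings and flag any combination that
# produces implausible behaviour.
import itertools

def stress_test(simulate, base_params):
    inflows = [4.0, 8.0, 12.0, 20.0]              # assumed scenario range, ML/day
    outflow_fractions = [0.01, 0.05, 0.10, 0.25]  # assumed decision range
    failures = []
    for inflow, frac in itertools.product(inflows, outflow_fractions):
        params = dict(base_params, inflow=inflow, outflow_fraction=frac)
        traj = simulate(params)
        # Behavioural expectations: storage stays non-negative, does not blow
        # up, and ends near its analytical equilibrium inflow / frac.
        equilibrium = inflow / frac
        if min(traj) < 0 or max(traj) > 10 * equilibrium:
            failures.append((inflow, frac, "out of plausible bounds"))
        elif abs(traj[-1] - equilibrium) / equilibrium > 0.10:
            failures.append((inflow, frac, "did not settle near equilibrium"))
    return failures

print(stress_test(simulate, PARAMS) or "all scenario settings behaved as expected")
```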

Good practices in model use:

  • Ensure that the final model is only delivered to end users for their purposes after it has undergone and passed rigorous model evaluation. If released prematurely, errors in the model (including bugs, or structural or behavioural errors) may diminish end users' confidence in the model, even if the errors are subsequently fixed.
  • Link the model behaviour back to the sources of dynamics (e.g., feedback loops and delays). Make use of conceptual models (e.g., causal loop diagrams) developed throughout the process to complement the discussion.
  • Ensure the tools are well documented with adequate ‘help’ resources available online. Over-reliance on the developers for technical support is unwise and often limits uptake of the model.
  • Clearly discuss or describe the limits of the model, including possible inconsistencies with expected behaviour. Also note what the model best represents, including which behaviours are expected to be good indicators of response.

Good practices in software selection:

  • Explore the strengths and limitations of software platforms for specific technical and participatory modelling requirements early in the project, as software selection can have large implications for the modelling capabilities. If unsure about specific requirements, start with an easy-to-use, open-access package (e.g., InsightMaker) until there is a better understanding of the required functionalities.
  • In general, consider using system dynamics software with an active user community, as such communities are more likely to provide adequate information, communication, and support for modellers.

Although we used system dynamics models as the common lens through which lessons were drawn, many of these insights are applicable to other modelling approaches. Are any of these practices useful for challenges you face? Are there good practices you can add to these lists?

To find out more:
Elsawah, S., Pierce, S., Hamilton, S.H., van Delden, H., Haase, D., Elmahdi, A., Jakeman, A. J. (2017). An overview of the system dynamics process for integrated modelling of socio-ecological systems: Lessons on good modelling practice from five case studies. Environmental Modelling and Software, 93: 127-145. Online: http://www.sciencedirect.com/science/article/pii/S136481521631091X

Biography: Sondoss Elsawah is a senior lecturer at the University of New South Wales, Canberra, Australia. She comes from an operations research background. Her research focuses on the development and use of multi-method approaches to support learning and decision making in complex socio-ecological and socio-technical decision problems. Application areas include natural resource management and defence capability management. Her recent work focuses on how to integrate and transfer knowledge across projects and application domains to improve the practice and teaching of systems modelling methodologies. She is a member of the Core Modeling Practices pursuit in the theme “Building Resources for Complex, Action-Oriented Team Science” funded by the National Socio-Environmental Synthesis Center (SESYNC).

Biography: Serena Hamilton is a Postdoctoral Research Fellow at the Centre for Ecosystem Management at Edith Cowan University, Western Australia. Her research interests include integrated assessment and modelling, Bayesian networks and decision support tools for water resources management. Her recent research focuses on modelling for improving understanding of system linkages and management of complex socio-ecological systems. She is a member of the Core Modeling Practices pursuit in the theme “Building Resources for Complex, Action-Oriented Team Science” funded by the National Socio-Environmental Synthesis Center (SESYNC).

7 thoughts on “Good practices in system dynamics modelling”

  1. … and please always keep in mind that SD is not the only, or necessarily the best, modelling paradigm and tool out there. Being an active SD modeller myself, I know how easy it is to start thinking about the whole world around us in terms of stocks and flows, but I also realize that this may be quite limiting for the results of a modelling project.

  2. A good refresher on modelling practice. Thanks Sondoss and Serena. I will be reading the actual paper too.
    Any thoughts on how to engage stakeholders (mid- to senior-level government officials) with no prior knowledge of SD modelling?

    • Thanks, Salman, for your question. It depends on the purpose of the engagement (e.g. getting their input on developing a particular part of the model, or communicating results, etc.). Do you have a specific example in mind?
      Sondoss

        • ‘Selling’ is the very keyword here, Salman. When selling something, we need to present its value proposition. In other words, we need to show the value it adds to their business. In doing this, I would not try to tell a story around stocks, flows, etc. or anything that fascinates us as modellers. Instead, I would focus on the possible solutions that the modelling approach can offer. One good practice is to pick 2-3 examples of problems the organisation has and use them as vignettes to showcase what the model can offer towards solving those problems. Archetypes can be useful in telling a story about the type of problems, and then explaining how the model can help avoid falling into such dynamics. Do you think this might work for the type of problems/context you are interested in? I'd be keen to validate it against your experience and knowledge of the domain of interest.
