Courses: The GoldSim Contaminant Transport Module:

Unit 3 - Introduction to Contaminant Transport Modeling Using GoldSim

Lesson 9 - GoldSim Modeling Philosophy

The large uncertainties associated with contaminant transport modeling discussed in the previous Lesson strongly inform the modeling philosophy embodied in GoldSim. This philosophy revolves around the idea that the complexity and detail that you include in your model should be consistent with the amount of uncertainty in the system. In particular, when building a model and deciding whether to add detail to a certain process, you should not just ask if you can (e.g., can you use a more detailed equation or more discretization?), but whether you should (does the amount of uncertainty I have in this process justify a more detailed model?).

An easy way to illustrate this is to consider a trivial example.  Imagine that we had a process that was described using the equation Y = A + B. A and B have similar magnitudes (“best estimates”). However, A is a parameter that has three orders of magnitude of uncertainty (that cannot be easily reduced).  B, on the other hand, only has an uncertainty of one order of magnitude.  By using a more detailed model, the uncertainty in B could be cut in half.  Should we add more detail to the model of B?  What we need to ask is how reducing our uncertainty in B will reduce our uncertainty in the “answer” (i.e., Y). Of course, in this trivial example, reducing the uncertainty in B has no significant impact on the uncertainty in Y (because it is dominated by the uncertainty in A).
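The arithmetic behind this example can be sketched with a quick Monte Carlo experiment (the specific distributions below are illustrative assumptions, not values from the course):

```python
import math
import random

random.seed(1)
N = 20_000

def decade_spread(samples):
    """Approximate order-of-magnitude spread: log10 of the ratio of the
    97.5th to the 2.5th percentile of the samples."""
    s = sorted(samples)
    return math.log10(s[int(0.975 * len(s))] / s[int(0.025 * len(s))])

# A: ~3 orders of magnitude of uncertainty (log-uniform, median 1)
# B: ~1 order of magnitude of uncertainty (log-uniform, median 1)
# B_refined: the "more detailed" model of B, with its spread cut in half
sample_A = lambda: 10 ** random.uniform(-1.5, 1.5)
sample_B = lambda: 10 ** random.uniform(-0.5, 0.5)
sample_B_refined = lambda: 10 ** random.uniform(-0.25, 0.25)

Y_simple = [sample_A() + sample_B() for _ in range(N)]
Y_refined = [sample_A() + sample_B_refined() for _ in range(N)]

spread_simple = decade_spread(Y_simple)
spread_refined = decade_spread(Y_refined)
print(spread_simple, spread_refined)  # similar values: A dominates Y either way
```

Halving the uncertainty in B barely changes the spread of Y, which is precisely the "should we?" question posed above: the extra detail in B buys almost nothing.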

In the real world, of course, the models are not so trivial. But the basic idea still applies: we should only add detail (to reduce uncertainty) to a process or parameter if by doing so we can reduce our uncertainty in the ultimate measure that we are trying to predict or the question that we are trying to answer. (And as discussed in Unit 11, Lesson 10 of the Basic Course, probabilistic modeling supports uncertainty and sensitivity analyses that allow you to determine specifically whether this is the case.)

This discussion may seem obvious, but anyone who has reviewed environmental models will readily acknowledge that most models have details that simply cannot be justified. The problem is not just that you are wasting time and money by adding these details. The problem is that such models can be misleading. The extreme detail can often act to mask the fact that there are huge uncertainties in the model, making the model seem more “correct” than it really is.

Top-Down Modeling

To avoid this problem, GoldSim supports and embodies a top-down modeling approach.

In general terms, there are two ways to approach any kind of modeling problem: from the "bottom-up", or from the "top-down".  “Bottom-up” modeling approaches attempt from the outset to model the various processes in great detail, and typically make use of complex process-level models for the various system components.  The emphasis is on understanding and explaining all processes in great detail in order to eventually describe the behavior of the entire system. 

While such an approach may seem at first glance to be "scientifically correct", for the following reasons it is generally not the best way to solve many real world problems:

  • The level of detail in a model developed from the bottom-up often becomes inconsistent with the amount of available information and the uncertainty involved. That is, a model is only as good as its inputs, and if you don't have much information, a detailed model is generally no better than a simple one.
  • It is often difficult to appropriately integrate and capture interdependencies among the various model components in a bottom-up model, since it is often impossible (or computationally impractical) to dynamically couple the various detailed process-level models used for the components of the system.  As a result, important interactions in the system are often intentionally or unintentionally ignored in a bottom-up model. 
  • It is easy for a bottom-up modeling project to lose sight of the "big picture" (i.e., to "not see the forest for the trees").  As a result, such an approach can be very time-consuming and expensive, with much effort being spent on modeling processes that prove to have little or no impact on the ultimate modeling objectives. 
  • Finally, such models tend to be very difficult to understand, explain, and hence use outside of the group of people who created them.

In a "top-down" modeling approach, on the other hand, the controlling processes may initially be represented by approximate (i.e., less detailed or “abstracted”) models and parameters. The model can then evolve by adding detail (and reducing uncertainty) for specific components as more is learned about the system.  Such an approach can help to keep a project focused on objectives of the modeling exercise without getting lost in what may prove to be unnecessary details. Moreover, because a properly designed top-down model tends to be only as complex as necessary, is well-organized, and is hierarchical, it is generally easier to understand and explain to others.

There are four key points in the application of a top-down modeling approach:

  • Top-down models must incorporate a representation of the model and parameter uncertainty, particularly that due to any simplifications or approximations.
  • As opposed to representing all processes with great detail from the outset, details are added only when justified (e.g., if additional data are available, and if simulation results indicate that performance is sensitive to a process that is currently represented in a simplified manner).  That is, details are added only to those processes that are identified as being important with respect to your modeling objectives and where additional detail will reduce the uncertainty resulting from model simplifications.
  • A top-down model does not have to be “simple”. Whereas a “simple” model might completely ignore a key process, a well-designed top-down model approximates the process at an appropriate level while explicitly incorporating the resulting uncertainty that is introduced.
  • A top-down model lends itself to being a “total system” model, in which all relevant processes are integrated into a single coupled model.

Iterative modeling

It is important to understand that in some cases a top-down model may become very complex indeed. The key point, however, is that it generally does not start out as a complex model.  Rather, it evolves over time to a level of complexity that is appropriate given the modeling objectives, information about the system, and the uncertainty in that information.

This is accomplished by carrying out the various modeling steps iteratively, responding to new information and/or preliminary modeling results. That is, modeling any system should be an iterative process.

The basic concept is that as new data are obtained (e.g., through a data collection program or research) and/or as new insights into the behavior of the system are obtained (based on preliminary model results), you should reevaluate and refine the model. In some cases, you might “loop back upward” in the middle of the process.  For example, development of a conceptual and corresponding mathematical model depends on the type of data available. If you determine that your available data are highly uncertain, the conceptual and mathematical models may need to be adjusted accordingly (e.g., if data are sparse and uncertain, a highly detailed model would likely be inappropriate).

Evaluating the results may also include considering how well the model supports the decision to be made. If the model results show, for example, that in most cases a regulatory criterion is not exceeded, then perhaps a decision maker has sufficient comfort to move forward. If, however, the uncertainty in the result is large enough that in many cases the criterion is not met, then the decision maker might require less uncertainty. A sensitivity analysis of the results can identify the largest sources of uncertainty in the model, and, with some effort, these might be reduced. Reducing uncertainty in key parameters typically incurs some expense (e.g., field work or chemical analyses), so one must weigh the value of the resulting information. If that value is deemed sufficient, since better information may provide a sounder basis for an expensive decision, then one may attempt to reduce the uncertainty in an important model result by reducing uncertainty in the key parameters. This process iterates until a decision can be made with confidence and defensibility.

Simulation models that are constructed and continuously modified in this manner can then do more than simply provide predictions of performance; they can provide a systematic framework for organizing and evaluating the available information about the system, and can act as a management tool to aid decision-making with regard to data collection and resource allocation (what should be studied, when, and in what detail?).

How GoldSim Supports Iterative, Top-Down Modeling

So what does a “top-down”, iterative approach actually mean in terms of building contaminant transport models in GoldSim?

If you are experienced with building detailed, high-resolution models of environmental systems (e.g., using finite element or finite difference tools), you will find that GoldSim is not like those tools at all.  It differs in three fundamental ways:

  • Spatial Resolution: Whereas a finite element or finite difference model would typically have a very high level of discretization (effectively thousands of finite volumes), a GoldSim model would have a much lower level of spatial resolution (perhaps tens or hundreds of finite volumes). Due to this lower spatial resolution, processes will typically be “abstracted”, “lumped” and/or averaged to some extent.
  • Integration: Detailed, high-resolution models are typically designed to do one thing very well (e.g., model groundwater flow and transport or model geochemistry). It is often difficult to appropriately integrate and capture interdependencies among the various model components when using such an approach, since it is often impossible (or computationally impractical) to dynamically couple the various detailed models used for the components of the system.  As a result, important interactions in the system are often intentionally or unintentionally ignored.  GoldSim, on the other hand, allows you to build a single model that focuses on integrating and coupling all system components. 
  • Uncertainty: For computational reasons, it is quite difficult to deal with uncertainty in detailed, high-resolution models. GoldSim, on the other hand, allows you to explicitly represent the (typically high) uncertainty present in contaminant transport models.

So how do we represent complex multi-dimensional environmental systems in GoldSim using a low level of spatial resolution? As we will discuss in the next Lesson, in GoldSim we build abstracted and lumped representations of complex multi-dimensional systems by appropriately connecting zero-dimensional components (mathematically, the well-mixed tanks described in previous Lessons) and one-dimensional components (mathematically, the tubes described in previous Lessons). In subsequent Units, we will describe in detail how to build these simple components; in a later Unit we will illustrate how they can be combined to represent higher-dimensional complex systems.
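As a rough illustration of the connected well-mixed-tank idea (all numbers below are invented for the sketch; this is not GoldSim syntax or defaults), a one-dimensional pathway can be approximated by a chain of mixing cells, each passing its outflow to the next:

```python
# A 1-D transport pathway approximated as a chain of well-mixed cells.
N_CELLS = 10
V = 100.0          # volume of each cell [m^3]
Q = 5.0            # flow rate through the chain [m^3/day]
dt = 0.5           # time step [day]; keep Q * dt / V well below 1
mass = [0.0] * N_CELLS
mass[0] = 1000.0   # instantaneous pulse of contaminant [g] in the first cell
exited = 0.0       # cumulative mass flushed out of the last cell

for _ in range(400):                              # simulate 200 days
    outflow = [Q * (m / V) * dt for m in mass]    # mass leaving each cell
    for i in range(N_CELLS):
        mass[i] -= outflow[i]
        if i + 1 < N_CELLS:
            mass[i + 1] += outflow[i]             # each cell feeds the next
        else:
            exited += outflow[i]                  # last cell discharges downstream

print(sum(mass) + exited)  # mass balance: total is conserved at 1000 g
```

Ten coupled cells (tens of state variables) stand in for what a finite element model might resolve with thousands of nodes; the price is some numerical smearing of the pulse, which is part of the "abstraction" described above.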

As will be seen as you progress through this Course, the GoldSim Contaminant Transport Module can certainly be used to build very complex models.  As discussed above, however, the key philosophy embodied in GoldSim is that in almost all situations, you should design your models from the top-down, keeping them as simple as possible and adding complexity only when it is warranted.  That is, you are encouraged to keep in mind what George Box told us at the beginning of this Unit:

Since all models are wrong the scientist cannot obtain a "correct" one by excessive elaboration… Just as the ability to devise simple but evocative models is the signature of the great scientist, so overelaboration and overparameterization is often the mark of mediocrity.