Over the next few decades, concerns over climate change and energy security will drive fundamental changes in the global supply, transport, and use of energy. Energy system and integrated assessment models play a vital role in the planning process by providing insight into the future impacts of policies and technology deployment. These models can be generically referred to as energy-economy optimization (EEO) models: they minimize cost or maximize utility by, at least in part, optimizing the energy system over multiple decades. EEO models encoded with a set of structured, self-consistent assumptions and decision rules have emerged as a key tool for the analysis of energy and climate policy at the regional, national, and international scales.

Personal observation and a survey of the literature have led us to identify two critical shortcomings regarding existing energy-economy optimization models:

  • With a few exceptions, the models are not open source. Because EEO models necessarily have long time horizons and expansive system boundaries, and encompass both physical and social phenomena, the level of descriptive detail that can be provided in model documentation and in peer-reviewed journals is insufficient to reproduce a specific set of published results. While there have been rigorous efforts to compare model results (e.g., the Stanford Energy Modeling Forum), the lack of access to source code prevents a deeper level of external verification. Making executable models available without source code would be a step in the right direction, but it would still not allow users to investigate how the underlying system of equations and data leads to specific outcomes.
  • Treatment of uncertainty is often absent or cursory. Much effort is expended building larger, more complex models that endogenize observed phenomena, but because there is no practical method to validate model results, there is little to rein in efforts that do not improve model performance. The size and complexity of many energy-economy models make them difficult to run iteratively, and policy analysis is often reduced to a few highly detailed, illustrative scenarios. Model time horizons on the order of 50 to 100 years necessitate a rigorous exploration of the decision space to ensure that model insights are robust to large future uncertainties. While there is no practical solution to the validation problem, modelers should be cognizant of creep in EEO model complexity. Judgments about the appropriate level of model detail and sophistication are subjective and should be tailored to specific research objectives, but insights generated with EEO models must, to the degree possible, be robust to large future uncertainties. If not, they are of little practical value to policy planners and decision makers.

TEMOA (Tools for Energy Model Optimization and Analysis) will address these issues by combining expertise in energy modeling, economics, operations research, and computer science to create a new suite of modeling tools.

                                             Research Objectives

  • Build an open source, technology-explicit energy-economy optimization model. We provide public access to our revision control system via the web. We also plan to take snapshots of the model source code and data used to produce each published model-based analysis, enabling third-party verification of our work.
  • Utilize open source software tools wherever possible, including the programming language, database, graphing and visualization tools, and solvers. This choice makes the TEMOA model broadly accessible. The biggest challenge is the choice of solver: the best linear programming solvers (e.g., CPLEX and Gurobi) are commercial products.
  • Design the model to make uncertainty analysis more tractable. We plan to develop scripts that automate sensitivity analysis, Monte Carlo simulation, multi-stage stochastic optimization, and search techniques to find near-optimal alternative solutions.
  • Utilize multi-core and compute cluster environments to perform rigorous uncertainty analysis. Given the steep drop in computer hardware costs, running code in an embarrassingly parallel fashion to enable uncertainty analysis has become a practical and cost-effective option.
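One way to reconcile open source accessibility with commercial solver performance, as discussed above, is to prefer a licensed solver when one is installed and fall back to an open source alternative otherwise. The sketch below checks the system PATH for solver executables; the binary names and the preference order are assumptions, not part of the TEMOA design.

```python
# Sketch: pick the first available LP solver executable on PATH.
# The binary names below are illustrative; adjust to the local installation.
import shutil

# Commercial solvers first (if licensed), then open source fallbacks.
PREFERRED_SOLVERS = ["cplex", "gurobi_cl", "cbc", "glpsol"]

def select_solver(preferred=PREFERRED_SOLVERS):
    """Return the name of the first solver found on PATH, or None."""
    for name in preferred:
        if shutil.which(name) is not None:
            return name
    return None
```

A modeling framework can call `select_solver()` at startup so the same model file runs unchanged on machines with or without a commercial license.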
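The last two objectives can be combined in a single pattern: because each Monte Carlo draw requires an independent model solve, the runs parallelize trivially across cores. The sketch below is a minimal illustration, not TEMOA code; `toy_system_cost` is a stand-in for a full model solve, and the parameter names and sampling ranges are invented for the example.

```python
# Sketch of an embarrassingly parallel Monte Carlo sensitivity run.
# `toy_system_cost` stands in for one full EEO model solve; the inputs
# (fuel price, demand growth) and their ranges are purely illustrative.
import random
from multiprocessing import Pool

def toy_system_cost(sample):
    """Placeholder for one model solve: total cost given sampled inputs."""
    fuel_price, demand_growth = sample
    # Toy cost: fuel price scaled by a decade of compounded demand growth.
    return 100.0 * fuel_price * (1.0 + demand_growth) ** 10

def draw_samples(n, seed=42):
    """Draw n (fuel_price, demand_growth) pairs from assumed ranges."""
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    return [(rng.uniform(2.0, 8.0), rng.uniform(0.0, 0.03)) for _ in range(n)]

def monte_carlo(n_samples=1000, workers=4):
    """Evaluate the model at each sample in parallel; return all costs."""
    samples = draw_samples(n_samples)
    with Pool(workers) as pool:
        return pool.map(toy_system_cost, samples)
```

Because the samples are independent, each solve can also be dispatched as a separate cluster job with no inter-process communication, which is what makes this workload embarrassingly parallel.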