Let’s pretend your name is Jerome Powell. You have the difficult problem of figuring out how much to raise interest rates by. Whatever you decide, there will be profound and propagating effects throughout the economy, so choose wisely!
What monetary policy analysis tools do you have in your toolbox to help you? Until recently, you were reliant on large-scale macroeconometric models such as the ECB’s Area-Wide Model, which attempt to estimate dynamic correlations between macroeconomic aggregates. That’s a lot of fancy speak for a giant correlation table that tracks what happens to some economic metric over there when you turn this knob over here. These cumbersome models have fallen out of favor because they perform poorly in times of frequent economic regime change. While potentially useful for short-term forecasting, these fancy correlation tables must be recalibrated every time the economy enters a new regime.
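To make that “correlation table” idea concrete, here’s a minimal sketch of the old-school approach: fit a first-order vector autoregression (VAR(1)) on aggregate time series and read the fitted coefficient matrix as the knob-to-metric table. The data below is synthetic and the series are placeholders, not real macro data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for three aggregate series
# (think GDP growth, inflation, policy rate), generated from a known VAR(1).
A_true = np.array([[0.5, 0.1, -0.2],
                   [0.1, 0.6,  0.1],
                   [0.0, 0.3,  0.7]])
T = 400
data = np.zeros((T, 3))
for t in range(1, T):
    data[t] = A_true @ data[t - 1] + 0.1 * rng.normal(size=3)

# Fit x_t = c + A x_{t-1} + noise by least squares.
X = np.hstack([np.ones((T - 1, 1)), data[:-1]])  # intercept + lagged values
Y = data[1:]
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = coef[1:].T  # A_hat[i, j]: next-period effect of series j on series i

# The fitted matrix IS the "correlation table": turn knob j today,
# read off the one-step response of metric i tomorrow.
shock = np.array([0.0, 0.0, 0.25])  # e.g., a surprise move in the policy rate
print("one-step responses:", A_hat @ shock)
```

The catch, as noted above, is that `A_hat` is only valid within the regime it was estimated on; change the regime and the table has to be rebuilt.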
We need a framework that behaves well during these regime changes. Instead of finding correlations between major macroeconomic indicators, what if we decomposed those indicators into their core components? Then let’s further decompose those core components. At some point, we end up investigating the microeconomic decisions of individual people in our economy. If we can simulate how those people behave, we can test economic policy by simulating their individual actions and aggregating those actions up into the macroeconomic indicators we care about. Fundamentally, instead of a top-down approach, we’re going to use a bottom-up approach for modeling our economy.
Enter stage right the DSGE model. The key features of this class of models are aptly present in its acronym. These models are Dynamic, meaning that we care about the behavior of the model over time. Stochastic refers to random exogenous shocks built into the model that perturb the equilibrium conditions. General refers to the model’s “holistic” treatment of the economy in its entirety, drawing a distinction from empirical modeling, which emphasizes a “sum of parts” approach combining many partial markets. Equilibrium refers to the model’s microeconomic foundations, which enforce balanced market conditions.
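As a toy illustration of the Stochastic ingredient, here’s one common way to model exogenous aggregate shocks: a two-state Markov chain that switches the economy between boom and bust. The transition probabilities and productivity levels below are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two aggregate states: 0 = boom, 1 = bust. P[i, j] is the probability of
# moving to state j next period given the economy is in state i today.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

state, path = 0, []
for t in range(200):
    path.append(state)
    state = rng.choice(2, p=P[state])

# The aggregate state drives exogenous productivity: higher in booms.
z = np.where(np.array(path) == 0, 1.05, 0.95)
print("fraction of periods in a boom:", np.mean(np.array(path) == 0))
```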
So that’s a lot of fancy speak, Erick. Just tell me about the model.
Our goal is to simulate individual agent behavior in an economy. We track each agent’s spending and saving behavior over time. Our agents have jobs where they earn money based on their individual productivity, and their employment is governed by the overall health of the economy: they are more likely to be employed when the economy is booming and unemployed when it is busting. Additionally, the economy has market clearing conditions on wages and the interest rate. This means that wages are inversely related to the labor supply and the interest rate is inversely related to the capital supply. The goal of each agent today is to make optimal spending decisions while accounting for tomorrow’s aggregate uncertainty.
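Here’s a heavily simplified sketch of one such simulation, assuming a Cobb-Douglas production technology (which delivers exactly the market clearing behavior above: wages fall as labor supply rises, the interest rate falls as capital supply rises). All parameter values are invented, and the fixed saving rate is a stand-in for the real decision rule, which would come from solving each agent’s dynamic program under uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000                                  # number of simulated agents
alpha, depreciation, save_rate = 0.36, 0.025, 0.2  # illustrative parameters

assets = np.full(N, 10.0)                  # each agent's capital holdings
skill = rng.lognormal(sigma=0.3, size=N)   # idiosyncratic productivity

def clearing_prices(K, L, z, alpha=0.36):
    # Cobb-Douglas marginal products: wages fall as labor supply L rises,
    # and the interest rate falls as capital supply K rises.
    wage = (1 - alpha) * z * (K / L) ** alpha
    rate = alpha * z * (L / K) ** (1 - alpha)
    return wage, rate

for t in range(50):
    boom = t % 10 < 7                      # crude boom/bust cycle, for illustration
    z, employment_rate = (1.05, 0.96) if boom else (0.95, 0.90)
    employed = rng.random(N) < employment_rate  # employment tracks the aggregate state
    L = (skill * employed).sum()           # effective labor supplied
    K = assets.sum()                       # capital supplied
    wage, rate = clearing_prices(K, L, z)
    income = wage * skill * employed + rate * assets
    # Stand-in decision rule: save a fixed share of income. The real model
    # would solve each agent's optimization problem under aggregate uncertainty.
    assets = (1 - depreciation) * assets + save_rate * income

print("aggregate capital after 50 periods:", assets.sum())
```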
At an abstract level, policy decisions such as interest rate and unemployment targets change the microeconomic parameters of an economy. We can predict the impact of these policy decisions by perturbing the model’s input parameters, simulating individual agent behavior, and aggregating those actions back up into the economic aggregates we care about. This type of sensitivity analysis is useful for tuning monetary policy.
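A sketch of what that sensitivity analysis could look like in code, assuming a hypothetical `simulate` function that wraps the agent loop above and returns a dict of macroeconomic aggregates; the parameter name `rate_target` is likewise made up for illustration.

```python
def policy_impact(simulate, base_params, knob, deltas):
    """Perturb one policy input, rerun the simulation, and report how the
    resulting macro aggregates move relative to the baseline run."""
    baseline = simulate(**base_params)
    impacts = {}
    for delta in deltas:
        perturbed = {**base_params, knob: base_params[knob] + delta}
        result = simulate(**perturbed)
        impacts[delta] = {key: result[key] - baseline[key] for key in baseline}
    return impacts

# Hypothetical usage: nudge the rate target by +/- 25 bps and read off the effects.
# policy_impact(simulate, {"rate_target": 0.05}, "rate_target", [-0.0025, 0.0025])
```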