Interests - Entropy In ABMs
Sometime in 2010, the students I tutored in independent studies asked me to develop a model economy so they could understand how economies function. With some reluctance (as I never really felt I understood economics) I undertook a three-month programming exercise. We set our goal (and I set my personal goal): to understand the dynamics of a modern economy. We envisioned a little model economy with farms, coal and steel mines, factories, bazaars, oil wells, and transportation companies, conforming to the laws of conservation of mass and energy and recycling all goods. Within two months I had the model done, and it worked correctly, but only briefly: it was totally unsustainable, and rampant inflation always brought an early end to all life. We scaled it back, settling for farms and nothing more. But even with that drastically scaled-down model in mind, we (my students and I) could not find a way to make it sustainable, that is, until we removed all price negotiations from commercial transactions. Like ignoring friction in a physical system, suddenly it all became simple, and the simple farming economy flourished. We called that simple farming economy with its unrealistic, draconian price rules the PMM (Perpetual Motion Machine).
In studying the wealth distribution within the PMM, we discovered that it conformed to the Maxwell distribution of speeds of atoms in an idealized gas. The graph at right shows the normalized distribution of wealth for 1,000,000 agents (blue) and the best fit of the Maxwell speed distribution. It is a close enough fit to be interesting, and so we were led to discover the work of Dr Yakovenko and his fellow econophysicists, and the gas theory of economic wealth distributions.
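For readers unfamiliar with the Maxwell speed distribution: it describes the magnitude of a 3-D vector whose components are independent Gaussians. The sketch below (illustrative only, not the PMM wealth data) samples speeds that way and checks the known mean-speed relation, ⟨v⟩ = 2a√(2/π).

```python
import math
import random

# The Maxwell speed distribution f(v) ~ v^2 * exp(-v^2 / (2*a^2)) arises as
# the magnitude of a 3-D vector with independent Gaussian components of
# standard deviation a.  Draw samples that way and verify the mean speed.
random.seed(42)
a = 1.0            # scale parameter of the distribution
n = 200_000        # number of sampled "agents"

speeds = [math.sqrt(random.gauss(0, a) ** 2 +
                    random.gauss(0, a) ** 2 +
                    random.gauss(0, a) ** 2) for _ in range(n)]

mean_speed = sum(speeds) / n
theory = 2 * a * math.sqrt(2 / math.pi)   # theoretical mean speed, ~1.596 for a = 1
print(f"sample mean {mean_speed:.4f} vs theoretical {theory:.4f}")
```

Fitting such a curve to wealth data amounts to choosing the scale parameter a, e.g. from the sample mean via the relation above.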
Since then, I have made contact with Dr Yakovenko and he has been kind enough to provide me with some incredibly useful feedback and guidance from time to time. It also led me to discover the works of C.A.S. Hall on EROEI (energy returned on energy invested).
So, here I am several years later, my students graduated and gone, but I am still trying to understand how modern economies function. Along the way I have come to believe (that is, to hold strong opinions about) certain things, which explains the variety of notes, draft papers, and other documents you will find on this page and on other related pages.
THINGS I HAVE COME TO BELIEVE:
- BELIEF NUMBER ONE: The absolutely best way to really study the dynamics of modern economies is via well-constructed agent-based models (ABMs). There is some credible research supporting this belief. However, there is little effort being spent to develop the needed ABMs that would be the tools for such study. So, my three-month project has turned into a one-person crusade to convince others that such models need to be built. (I confess there has been little success or progress there.) My ModEco software is available from my site here, or from the OpenABM site, or from the NetLogo Modelling Commons site.
- BELIEF NUMBER TWO: The absolutely most important feature of modern economies that must be understood to advance modern economic theory is the ubiquitous and immense role that entropy plays in economic growth and decay. The role of energy is reasonably well understood, now, due to the fundamental work of C.A.S. Hall and his colleagues who sometimes refer to themselves as "Biophysical Economists". The role of entropy, the other member of the energy-entropy pair of twins, has gone relatively unexplored.
- BELIEF NUMBER THREE: Entropy is as ubiquitous and as active in economic ABMs, such as the capital exchange models designed by Dr Yakovenko or my PMM, as it is in energy systems and in modern economies. But it is much easier to study entropy in ABMs. (E.g. see my implementation of entropic indices in figures 6 and 7 of ModEco (NL).)
Diary notes, documents and draft papers
Legend: NTF means 'Note to File', and is a diary note. I typically write them and then revise them later as I learn more, or not. PPR means the document is in the format of a formal paper, unpublished. XLS means the file is an MS Excel file. ODD means the document is a formal model description using the ODD protocol.
I got a copy of the book "Classical Econophysics" of which Dr Yakovenko was one of the authors. These are my somewhat naive notes taken as I read the book, in the very early days of this journey.
I started writing this draft paper in January of 2013. It is in its 12th revision, and not yet done. Sorry. Still, I think it is really worth reading, so I offer it here.
In it I explore the nature of entropy in Model I of EiLab. Model I is a simplification of one of Dr Yakovenko's capital exchange models. I develop a formula for the entropic index of a histogram, I verify some theoretical calculations by comparison with actual output from the model, and I explore entropy as the source of the 'arrow of time' in this extremely simple ABM. If you do nothing else, look at the cool chart in figure 03 below (page 22 in the paper).
Figure 01 - The states of H(4,8)
The graph to the left shows the set of all possible states (points) in all possible state spaces (i.e. all possible initial endowments of wealth, each forming a column) of an extremely simple capital exchange model. The vertical position of each point is the entropic index of that state. For example, if the model has an initial endowment of $20 (the middle column of points, whose maximal entropic index is 1.0), it has exactly 13 possible states, represented by seven visible points in a column. Twelve of those states form six pairs having the same entropic index, so the members of each pair hide one another.
Why? Suppose you have a histogram with four bins (K=4) and eight agents (A=8). Agents can have wealth of only $1, $2, $3 or $4. Then there are only 13 ways to allocate the wealth such that the sum of their wealth is $20. Those 13 variations on allocation of wealth are the 13 allowed states of the model with an initial endowment of $20. Such a model is designated H(K,A,W)=H(4,8,20), and it is represented by the middle column of points. Each 'state' is a histogram, or a configuration, and is given a configuration number. For example, histogram number 69 is h(2,2,2,2), the equilibrium state, and the state in which the model is most likely to be found.
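The counting argument above is easy to verify by brute force. The sketch below enumerates the states of H(4,8,20) and computes an entropic index for each; I am assuming here (my reading of the text, not the paper's exact formula) that the entropic index is ln(multiplicity) normalized by its maximum, where multiplicity is the multinomial coefficient A!/(n1!·n2!·n3!·n4!).

```python
import itertools
import math

# Enumerate the states of H(K, A, W) = H(4, 8, 20): histograms (n1..n4)
# with n1+n2+n3+n4 = 8 agents and 1*n1 + 2*n2 + 3*n3 + 4*n4 = $20 wealth.
K, A, W = 4, 8, 20

states = [h for h in itertools.product(range(A + 1), repeat=K)
          if sum(h) == A and sum((i + 1) * n for i, n in enumerate(h)) == W]

def log_multiplicity(h):
    # ln( A! / (n1! n2! n3! n4!) ), computed via lgamma to avoid overflow
    return math.lgamma(A + 1) - sum(math.lgamma(n + 1) for n in h)

s_max = max(log_multiplicity(h) for h in states)
index = {h: log_multiplicity(h) / s_max for h in states}

print(len(states))                  # 13 allowed states
print(index[(2, 2, 2, 2)])          # the equilibrium state has index 1.0
print(len(set(round(v, 9) for v in index.values())))  # 7 distinct index values
```

Under this assumed definition the 13 states collapse onto exactly seven distinct index values (six pairs plus the equilibrium state), matching the seven visible points in the middle column of Figure 01.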
Figure 02 - Expected behaviour vs Observed behaviour
In the (draft) paper I have (a) described the computer model and presented the stochastically determined data from a run of the model, and (b) developed an analytic technique to predict the probability that the H(4,8,20) model will be in each state. Each of the thirteen states was given a 'configuration number'. Note that the states come in pairs, but the probability that a particular state will be exhibited can vary within a pair.
Figure 03 - The Transition Map of H(4,8,20)
Entropic index is on the Y axis. Note that states come in pairs having the same entropic index (e.g. 96 and 49). Arrows connect transition pairs (i.e. states between which transitions are possible under the rules of the model). States with low entropy can transition to states with higher entropy, and vice versa. However, the probability of transition is always asymmetric: the entropy of the system will most probably rise and remain high, though it does, on occasion, return to the bottom states. Note that some downward paths end in a cul-de-sac. The arrow of time comes primarily from the asymmetric probabilities of transition, but also from the effects of the green culs-de-sac.
Dr Yakovenko suggested that some of my problems may be due to the use of Stirling's approximation for ln(N!). This is my first foray into exploring the implications of using Stirling's approximation vs other means of computing entropy in an ABM.
This note brings together some ideas from previous notes and develops the mathematical formulas for use in computing entropy in ABMs with as much formalism as I can manage. I come to the conclusion that a one-formula-for-all-purposes is not very workable.
This is the final note in the sequence, and an important note, I think. I attempt to solve the problem discovered as I wrote the previous note. The problem goes like this:
- To compute the entropic index in an ABM you need to calculate ln(N!) for values of N from 0 to several hundred, both natural numbers and real numbers.
- Excel's FACT() function can only handle N ≤ 170 before it overflows.
- The factorial formula can only handle integers.
- And Stirling's approximation only works well for N>30 or so.
- I want a routine that can be implemented in C++ in a model, in NetLogo, or in Excel, and is accurate over the needed domain.
- GammaLn() is a function that seems suitable, but it is not available as a built-in function in all development environments, so I explore how to make it available. In particular, the Lanczos approximation of GammaLn() looks workable.
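The approach in the bullets above can be sketched in a few lines. This uses the widely published six-coefficient Lanczos fit of ln(Γ(x)) (the Numerical Recipes gammln form), which is accurate to roughly 1e-10 for x > 0 and translates directly into C++, NetLogo, or an Excel VBA function; ln(N!) is then simply GammaLn(N+1), for natural or real N, with no overflow and none of the small-N breakdown that Stirling's approximation suffers from.

```python
import math

# Six-coefficient Lanczos fit of ln(Gamma(x)) (Numerical Recipes' gammln).
_COEFFS = [76.18009172947146, -86.50532032941677, 24.01409824083091,
           -1.231739572450155, 0.1208650973866179e-2, -0.5395239384953e-5]

def gammln(x):
    """ln(Gamma(x)) for x > 0, accurate to about 1e-10."""
    tmp = x + 5.5
    tmp -= (x + 0.5) * math.log(tmp)
    ser = 1.000000000190015
    y = x
    for c in _COEFFS:
        y += 1.0
        ser += c / y
    return -tmp + math.log(2.5066282746310005 * ser / x)

def ln_factorial(n):
    """ln(n!) for natural or real n >= 0, via ln(Gamma(n + 1))."""
    return gammln(n + 1.0)

def stirling(n):
    """Stirling's approximation of ln(n!), for comparison."""
    return n * math.log(n) - n + 0.5 * math.log(2 * math.pi * n)

print(ln_factorial(5), math.log(120))    # both ~4.7875: exact for small N
print(ln_factorial(170))                 # fine, even at Excel's FACT() limit
print(abs(stirling(5) - math.log(120)))  # Stirling's small-N error, by contrast
```

In environments with a built-in log-gamma (Python's math.lgamma, Excel's GAMMALN), the hand-rolled routine can be replaced by the built-in and used only where no such function exists.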
Here are two spreadsheets that go with this note:
In this document I try to bring together a far-flung collection of speculative ideas on how entropy might be playing a role in a variety of non-informational, non-thermodynamic systems. I have come to view entropy as a purely mathematical measure of histograms, one which displays a tendency to rise if the histogram is of a conserved quantity in a dynamic system. This is an attempt to think through the implications of that idea. In the slide below, I contrast entropy as a measure with the mean as a measure.
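The central idea here, that the entropy of a histogram of a conserved quantity tends to rise in a dynamic system, can be demonstrated in a few lines. This is a minimal sketch of my reading of the note, not the actual ModEco/EiLab code: a conserved total of dollars is churned by random pairwise $1 transfers, and the log-multiplicity entropy of the wealth histogram rises from zero.

```python
import math
import random

random.seed(1)
A = 100                  # number of agents
wealth = [10] * A        # everyone starts with the same $10; total is conserved

def histogram_entropy(ws):
    # ln( A! / prod(n_w!) ) over the bin counts n_w of the wealth histogram
    counts = {}
    for w in ws:
        counts[w] = counts.get(w, 0) + 1
    return math.lgamma(len(ws) + 1) - sum(math.lgamma(n + 1)
                                          for n in counts.values())

s_initial = histogram_entropy(wealth)   # one bin holds everyone, so S = 0

for _ in range(20_000):
    i, j = random.randrange(A), random.randrange(A)
    if i != j and wealth[i] > 0:        # transfer $1, conserving the total
        wealth[i] -= 1
        wealth[j] += 1

s_final = histogram_entropy(wealth)
print(s_initial, s_final)               # entropy has risen from 0
```

Nothing in the exchange rule refers to entropy; the rise is purely a counting effect on the histogram, which is the point of the note.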
Here are a couple of spreadsheets and a PowerPoint slide that go with this note:
PSoup (available from this site) is my application in which students can learn about Darwinian natural selection. In 2008 I attempted to show that an entropy-like concept was active in that application, and failed, due to a lack of understanding of what entropy is. I think I understand it much better now. I have not yet implemented entropy calculations in that application, but I have gone back to my old NTF and updated it with my new understanding. This is the first step towards that implementation.
In this note, which is now over 20 pages long and has its own table of contents, I have reconsidered the question of just what, exactly, I mean by energy and entropy. This line of thinking was started by my discussions with Dr Yakovenko in early 2014. I suppose it is, in some way, background information for all of the other notes on this page. Here is a copy of the table of contents:
Table of Contents
1 References 1
2 Background 4
3 Purpose 4
4 Energy 5
4.1 Energy as an accounting system applied to a biophysical system 6
4.2 Common units 6
4.3 Energy is additive across systems, or subsystems 6
4.4 Conservation of energy 7
4.5 Forms of energy 7
4.6 Transformations of energy 7
4.7 Localization of energy and the issue of system scale 10
4.7.1 Localization of kinetic energy 10
4.7.2 Localization of potential energy 10
4.8 Grades of energy 11
5 Entropy 12
5.1 Entropy as an accounting system 13
5.2 Entropy as an average 13
5.2.1 Entropy of a Histogram 14
5.2.2 Entropy of a static system 15
5.2.3 Entropy of a dynamic system 15
5.3 Forms of Entropy 15
5.3.1 Entropy as thermodynamic entropy 15
5.3.2 Entropy as informational entropy 16
5.3.3 Entropy as economic entropy 16
5.3.4 Entropy as ABM entropy 16
5.4 Units of entropy 16
5.5 Entropy is not additive across systems, or subsystems 16
5.6 Non-Conservation of entropy 16
5.7 Entropy as the arrow of time 17
5.8 Transformations of entropy 20
5.9 Localization of entropy 20
5.10 Grades of entropy (and energy?) 20
6 Summary 21
7 Material Added 150101 21
8 Final thoughts 23
Last updated: January 2015.