Post's "Why the properties of individual agents may be ignored?" thumbnail

Human behavior is perplexingly complex. Why is our collective behavior so well described by rather general mathematical models using very few parameters? Why do these models not need deeper insight into human psychology or decision making? One simple answer: if a statistical signature is not present in the data, which is usually aggregated at least to some degree, we can do nothing about it. Namely, we usually do not observe individuals making decisions, and as such we are unable to differentiate between different mechanisms of human decision making. There are two major mechanisms of human decision making – homophily (selecting your peers) and peer pressure (adopting your peers’ behavior). Mathematically there is usually no difference between them: both mechanisms can be described using the same Kirman’s model.
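
To make this point concrete, here is a minimal sketch of Kirman’s two-state model (the parameter names eps and h are our own illustrative choices, not taken from the post itself). Note that the interaction term looks exactly the same whether we read it as homophily or as peer pressure:

```python
import random

def kirman_step(n, N, eps, h):
    """One step of Kirman's two-state model.

    n   -- number of agents currently in state 1
    N   -- total number of agents
    eps -- idiosyncratic (independent) switching rate
    h   -- strength of the interaction (herding) term
    """
    x = n / N
    # a randomly chosen agent switches 0 -> 1 ...
    p_up = (1 - x) * (eps + h * x)
    # ... or 1 -> 0; whether the interaction term stands for homophily
    # or for peer pressure is invisible at this level of description
    p_down = x * (eps + h * (1 - x))
    r = random.random()
    if r < p_up:
        return n + 1
    if r < p_up + p_down:
        return n - 1
    return n

N, n = 100, 50
for _ in range(10000):
    n = kirman_step(n, N, eps=0.01, h=0.5)
print("final fraction in state 1:", n / N)
```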

In this text we will consider the Bass diffusion model with heterogeneous agents (each of them having its own independent parameters). We will show that the heterogeneous model produces macroscopic dynamics similar to those of the homogeneous model. To simplify matters even further, we will use the unidirectional version of Kirman’s model. Continue reading “Why the properties of individual agents may be ignored?”
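
A minimal sketch of the comparison described above (the parameter ranges are illustrative assumptions, not the values used in the full post): each agent adopts with per-step probability \( p_i + q_i X/N \), where \( X \) is the current number of adopters, and the homogeneous run simply gives every agent the mean parameters:

```python
import random

def bass_run(p_list, q_list, steps=100):
    """Agent-based Bass diffusion: returns the adopter fraction over time."""
    N = len(p_list)
    adopted = [False] * N
    history = []
    for _ in range(steps):
        x = sum(adopted) / N  # current adopter fraction
        for i in range(N):
            # non-adopters adopt by innovation (p) or imitation (q)
            if not adopted[i] and random.random() < p_list[i] + q_list[i] * x:
                adopted[i] = True
        history.append(sum(adopted) / N)
    return history

random.seed(11)
N = 1000
# heterogeneous agents: individual innovation and imitation parameters
p_het = [random.uniform(0.0, 0.02) for _ in range(N)]
q_het = [random.uniform(0.0, 0.6) for _ in range(N)]
# homogeneous agents all share the mean parameters
p_bar, q_bar = sum(p_het) / N, sum(q_het) / N

het = bass_run(p_het, q_het)
hom = bass_run([p_bar] * N, [q_bar] * N)
for t in (10, 25, 50, 99):
    print(f"t={t:3d}  heterogeneous={het[t]:.2f}  homogeneous={hom[t]:.2f}")
```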

Post's "AB model" thumbnail

Let us now return to the Voter model. In the original model we had agents occupying two possible states, choosing their state simply by copying the choices made by their neighbors. Yet in most elections around the world more than two parties compete for the electoral vote. Furthermore, it is hardly believable that an established supporter of one party would switch to the opposing party overnight. One way to account for these zealous supporters would be to introduce “agents with fixed state.” Yet some strongly opinionated individuals do change their beliefs, so this would not be an ideal solution. An alternative approach was considered in [1]. In that paper a three-state model is proposed, in which the third state serves as an intermediate stop for agents switching between the two main states. Continue reading “AB model”
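
A minimal well-mixed sketch of such a three-state dynamic (the transition probabilities below follow a commonly cited form of the AB model, but treat them as an assumption rather than the exact rates of [1]):

```python
import random

def ab_model(N=1000, sweeps=100):
    """Well-mixed three-state dynamics: agents never jump directly
    between A and B, they always pass through the intermediate AB state."""
    counts = {'A': N // 2, 'AB': 0, 'B': N - N // 2}
    states = ['A'] * counts['A'] + ['B'] * counts['B']
    for _ in range(sweeps * N):
        i = random.randrange(N)
        s = states[i]
        sigma_a, sigma_b = counts['A'] / N, counts['B'] / N
        r = random.random()
        new = s
        if s == 'A' and r < 0.5 * sigma_b:
            new = 'AB'                     # A supporter starts wavering
        elif s == 'B' and r < 0.5 * sigma_a:
            new = 'AB'
        elif s == 'AB':
            if r < 0.5 * (1 - sigma_b):
                new = 'A'                  # intermediate agent commits
            elif r < 0.5 * (2 - sigma_a - sigma_b):
                new = 'B'
        if new != s:
            counts[s] -= 1
            counts[new] += 1
            states[i] = new
    return {k: v / N for k, v in counts.items()}

random.seed(42)
print(ab_model())
```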

Post's "Epstein’s riot model" thumbnail

The previously discussed Granovetter threshold model is just one of numerous simple collective action models. This time we continue the same topic by considering another, somewhat more complex, riot model, proposed by Epstein in [1]. This model is rather interesting in the sense that it is not static, as the original Granovetter model is: it has interesting temporal dynamics built in. In a recent paper by British mathematicians [2] this model was applied to explain the patterns observed in the 2011 London riots. So let us see… Continue reading “Epstein’s riot model”
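
A heavily simplified, well-mixed sketch of the civil violence mechanism (the grid, agent movement, and jail terms of Epstein’s original model are omitted, and all parameter values are illustrative assumptions):

```python
import math
import random

N_AGENTS, N_COPS = 1000, 10
K, T = 2.3, 0.1      # arrest-probability constant and activation threshold
L = 0.5              # perceived legitimacy of the regime

random.seed(7)
# every agent draws a private hardship H and risk aversion R
hardship = [random.random() for _ in range(N_AGENTS)]
risk_aversion = [random.random() for _ in range(N_AGENTS)]
active = [False] * N_AGENTS

for step in range(10):
    n_active = max(sum(active), 1)
    # estimated arrest probability grows with the cop-to-rebel ratio
    arrest_p = 1 - math.exp(-K * N_COPS / n_active)
    for i in range(N_AGENTS):
        grievance = hardship[i] * (1 - L)
        net_risk = risk_aversion[i] * arrest_p
        active[i] = grievance - net_risk > T
    print(f"step {step}: {sum(active)} agents rioting")
```

Run as written, the crowd starts calm (a lone rebel faces near-certain arrest) and then cascades within a few steps, which is the non-static behavior the post refers to.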

Post's "Granovetter threshold model" thumbnail

Here on Physics of Risk we once again present a model of collective action. Last time we considered the Standing Ovation model by Miller and Page; in earlier years we wrote a lot about the Kirman and Bass models, as well as the correspondence between them. There is another classic model, covered in this post, which describes the human intention to join collective political action with inherent risk. In this text we will consider a threshold model proposed by Mark Granovetter. Continue reading “Granovetter threshold model”
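
The core of the model fits into a few lines. The sketch below reproduces Granovetter’s classic illustration: with thresholds 0, 1, …, 99 among 100 agents everyone ends up rioting, while bumping a single threshold from 1 to 2 breaks the chain:

```python
def granovetter_cascade(thresholds):
    """Iterate until the number of participants stops changing."""
    N = len(thresholds)
    participating = 0
    while True:
        # an agent joins if the fraction already participating
        # meets or exceeds its personal threshold
        new = sum(th <= participating / N for th in thresholds)
        if new == participating:
            return participating
        participating = new

# the classic illustration: thresholds 0, 1, ..., 99 out of 100
uniform = [i / 100 for i in range(100)]
print(granovetter_cascade(uniform))    # 100 -- everyone riots

# bump the single threshold-1 agent to threshold 2:
# the chain breaks and only the instigator acts
perturbed = [0.00, 0.02] + [i / 100 for i in range(2, 100)]
print(granovetter_cascade(perturbed))  # 1
```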

Post's "Standing ovation model" thumbnail

It has been a long time since the last interactive model appeared on Physics of Risk. This time we return to a problem we have previously considered, but did not model.

From time to time almost every one of us has an opportunity to go see a play. Afterwards everyone has to make a choice – to applaud or not. It appears to be a free choice, but actually it is not, as there are various social feedback loops in play. This problem was considered as a simple agent-based model in a paper by Miller and Page [1]. In this text we will briefly introduce you to it. Continue reading “Standing ovation model”
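
A stripped-down sketch of such a feedback loop (Miller and Page’s seating geometry is replaced by global visibility, and all parameter values are illustrative assumptions): each agent first stands if its noisy perception of the play’s quality exceeds a threshold, then joins in as long as enough of the audience is already standing.

```python
import random

def ovation(quality, noise=0.15, peer_threshold=0.3, N=500, rounds=20):
    """Returns the final fraction of the audience standing."""
    # initial decision: stand if the privately perceived quality
    # (true quality plus idiosyncratic noise) exceeds 0.5
    standing = [quality + random.gauss(0, noise) > 0.5 for _ in range(N)]
    for _ in range(rounds):
        frac = sum(standing) / N
        # social feedback: join in if enough of the audience stands
        standing = [s or frac > peer_threshold for s in standing]
    return sum(standing) / N

random.seed(3)
for q in (0.40, 0.45, 0.50):
    print(f"quality={q:.2f}  standing fraction={ovation(q):.2f}")
```

Even this crude version shows the characteristic all-or-nothing outcome: a slightly better play tips the initial standers past the peer threshold and the whole audience rises.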

The vast majority of scientific research begins with an idea of how the world works according to the proposer. The proposer formulates a hypothesis and tries to prove it using the scientific method, usually checking experiments or observations using various statistical tools. These tools are used to process the collected data and either confirm the initial hypothesis or reject it in favor of the alternatives.

One of these methods is the so-called critical value approach [1]. It relies on the researcher setting a precision standard for the statistical test and accepting or rejecting the hypothesis based on it. Different branches of science usually have their own rules on how large an error can be tolerated. For example, in the life sciences it is common to see most published papers report statistical significance of \( p < 0.05 \) (meaning that the probability of error is less than \( 5\% \)), while in physics it is rather frequent to hear about a precision of \( 5 \sigma \) (probability of error less than \( 5.7 \cdot 10^{-5} \% \)).
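
These numbers are easy to verify; the snippet below computes the Gaussian tail probabilities (the two-sided convention is assumed here, as it reproduces the quoted \( 5.7 \cdot 10^{-5} \% \)):

```python
import math

# two-sided tail probability of the standard normal distribution:
# p = P(|Z| > n) = erfc(n / sqrt(2))
for n_sigma in (2, 5):
    p = math.erfc(n_sigma / math.sqrt(2))
    print(f"{n_sigma} sigma: p = {p:.2e}  ({100 * p:.2e} %)")
# prints roughly 4.55e-02 (about 5%) for 2 sigma
# and 5.73e-07 (5.7e-05 %) for 5 sigma
```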

At first glance it appears that the method has no drawbacks. But in the context of the current science publishing tradition – mostly positive results being published – the drawbacks are evident. All statistical methods rely on numerous samples being taken, so for these kinds of tests to work, numerous independent groups should repeat the same experiment and obtain similar conclusions. Otherwise there is a significant possibility of a positive result being just a lucky fluke. Keeping in mind the pressure to publish, there is also a risk that the same research group would repeat an experiment until getting the desired statistical significance (waiting for a fluke to happen).
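
The “waiting for a fluke” risk is easy to quantify: if a true null effect is tested repeatedly at the \( p < 0.05 \) level, the chance of at least one false positive grows as \( 1 - 0.95^k \). A quick illustrative check:

```python
# chance of at least one false positive among k independent
# repetitions of a null experiment tested at the p < 0.05 level
for k in (1, 5, 14, 20):
    print(f"{k:2d} repetitions: {1 - 0.95 ** k:.2f}")
# at 14 repetitions the chance already exceeds one half (0.51)
```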

I did my best to introduce you to this problem, but there is a rather significant chance that Hank Green will do better in this SciShow video, which I invite you to watch.

For those who are more interested in the technical details, I would suggest reading a draft by Nassim Nicholas Taleb [2].