How to Stop Worrying and Learn to Love Uncertainty

From the foreword (Avvertenza) of Probabilità: come smettere di preoccuparsi e imparare ad amare l’incertezza, Carocci, Città della Scienza, February 2018.

If you are in a hurry, here is the central message:

Probability is culture.

Does this claim surprise you? Then find a little time to read on, because this book was written for you. And for all those people who rarely think about probability, or who perhaps associate it with electoral polls that are systematically proved wrong, or with the output of inscrutable algorithms for the often unreliable assessment of financial risk. So much for culture!

Its aim is to entice you to take a closer look at some central aspects of probabilistic reasoning and of the culture of uncertainty it helps us build. This is a culture we urgently need, because uncertainty is an ineliminable component of our lives, of society, of nature.
By helping us understand some of its fundamental aspects, probability provides a grammar for thinking about what is not, but could be, and for realising that the things we know could have been otherwise: deep philosophical questions that find a particularly clear formulation in the mathematics of probability. This kind of reasoning is fundamental to the scientific understanding of the world, but not only that. It is necessary for making informed decisions about our health and our well-being, and that of the people we care about. And it is necessary for each of us to take a conscious and active part in society, and in particular to fulfil our duty of democratic oversight over the conduct of institutions.
Continue reading →

Forecasting in Light of Big Data

Predicting the future state of a system has always been a natural motivation for science and practical applications. Such a topic, beyond its obvious technical and societal relevance, is also interesting from a conceptual point of view. This owes to the fact that forecasting lends itself to two equally radical, yet opposite, methodologies: a reductionist one, based on first principles, and a naive inductivist one, based only on data. The latter view has recently gained some attention in response to the availability of unprecedented amounts of data and increasingly sophisticated algorithmic analytic techniques. The purpose of this note is to assess critically the role of big data in reshaping the key aspects of forecasting, and in particular the claim that bigger data leads to better predictions. Drawing on the representative example of weather forecasts, we argue that this is not generally the case. We conclude by suggesting that a clever and context-dependent compromise between modelling and quantitative analysis stands out as the best forecasting strategy, as anticipated nearly a century ago by Richardson and von Neumann.
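As a back-of-the-envelope illustration of why purely inductive forecasting does not automatically improve with more data, here is a minimal sketch in the spirit of the argument (mine, not taken from the paper; the sample sizes and dimensions are arbitrary choices). It shows how the quality of the best past "analogue" of the current state degrades with the dimension of the system, even for a large historical record:

```python
# A rough look at why "more data" alone does not rescue purely inductive
# forecasting: the distance to the best past analogue of the current state
# grows quickly with the dimension of the system, for a fixed record size.
# Sample sizes and dimensions below are illustrative choices only.
import numpy as np

rng = np.random.default_rng(42)

def best_analogue_distance(n_points, dim):
    """Distance from a query state to its nearest neighbour among
    n_points past states drawn uniformly from [0, 1]^dim."""
    past = rng.uniform(size=(n_points, dim))
    query = rng.uniform(size=dim)
    return np.min(np.linalg.norm(past - query, axis=1))

for dim in (1, 3, 6, 12):
    dists = [best_analogue_distance(100_000, dim) for _ in range(20)]
    print(f"dim = {dim:2d}: typical best-analogue distance {np.mean(dists):.3g}")
```

With the same amount of data, the nearest past analogue is excellent in one dimension and essentially useless in a dozen, which gives a rough sense of why model-free prediction struggles for high-dimensional systems such as the atmosphere.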

KEYWORDS: Forecasting, Big Data, Epistemology

H. Hosni and A. Vulpiani, “Forecasting in Light of Big Data”,  Philosophy and Technology (2017). doi:10.1007/s13347-017-0265-3

Preprint available here

The Reasoner, my first editorial

I have just taken over the editorship of The Reasoner from Jon Williamson, who started it ten years ago. I am very excited about this, and this is my first editorial. Read the whole issue at

Reasoning is naturally multi-disciplinary, inter-disciplinary and inter-sectoral. While these tend to appear as buzzwords in the narrative of funding agencies in Europe and elsewhere, the reality is bitterly different. Reasoners struggle a lot when the workings of academia demand comparison with more focussed areas, in both the natural and the social sciences. At such moments, our strength is likely to turn into our weakness. Community building and its consolidation are therefore nothing less than vital to us.

Continue reading →

Convex MV-Algebras: Many-Valued Logics Meet Decision Theory

This paper introduces a logical analysis of convex combinations within the framework of Łukasiewicz real-valued logic. This provides a natural, albeit as yet unexplored, link between the fields of many-valued logics and decision theory, where the notion of convexity plays a central role.

We set out to explore such a link by defining convex operators on MV-algebras, which are the equivalent algebraic semantics of Łukasiewicz logic. This gives us a formal language to reason about the expected value of bounded random variables. As an illustration of the applicability of our framework, we present a logical version of the Anscombe-Aumann representation result.
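To give a concrete, if very informal, feel for the objects involved, here is a small numerical sketch of the standard MV-algebra on [0, 1] extended with a convex-combination operator. The function names and numbers are mine and purely illustrative; they are not the paper's axiomatisation:

```python
# A numerical sketch (not the paper's formal development) of the standard
# MV-algebra on [0, 1] together with an added convex-combination operator.
import numpy as np

def mv_neg(x):           # Lukasiewicz negation
    return 1.0 - x

def convex(lam, x, y):   # the extra operator: a convex combination
    return lam * x + (1.0 - lam) * y

# Many-valued events as [0,1]-valued functions on a finite sample space,
# and a probability distribution on that space.
rng = np.random.default_rng(1)
probs = rng.dirichlet(np.ones(6))
f, g = rng.uniform(size=6), rng.uniform(size=6)   # bounded random variables

expect = lambda h: float(np.dot(probs, h))

# The expectation operator respects negation...
print(np.isclose(expect(mv_neg(f)), 1.0 - expect(f)))        # True
# ...and commutes with the convexity operator (by linearity), which is the
# kind of interaction an algebraic treatment of expected value must capture.
lam = 0.3
print(np.isclose(expect(convex(lam, f, g)),
                 convex(lam, expect(f), expect(g))))          # True
```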

KEYWORDS: MV-algebra, convexity, uncertainty measures, Anscombe-Aumann

Flaminio, T., H. Hosni, and S. Lapenta. 2017. “Convex MV-Algebras: Many-Valued Logics Meet Decision Theory.” Studia Logica (Online First). DOI: 10.1007/s11225-016-9705-9.

40 Years of Dempster-Shafer Theory

Originally published in The Reasoner, Volume 10, Number 12, December 2016

The International Journal of Approximate Reasoning is celebrating 40 years of Dempster–Shafer theory with a very interesting special issue.

The opening sentence of the Editorial, by Thierry Denoeux, sets the tone:

Among the many books published each year, some are good and a few are very good, but only exceptionally does a book propose a radically different way of approaching a scientific question, and start a new research field. “A Mathematical Theory of Evidence” by Glenn Shafer, which appeared in 1976, is one of those.
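For readers who have not met the theory, here is a toy sketch of its two basic ingredients: a mass function on a frame of discernment and the belief and plausibility measures it induces. The scenario and numbers are my own, not taken from the special issue:

```python
# A toy illustration of Dempster-Shafer theory: a mass function on a frame
# of discernment and the belief / plausibility pair it induces.
frame = frozenset({"rain", "snow", "dry"})

# Some evidence points to precipitation without discriminating rain from
# snow, some to dry weather; the rest is committed to the whole frame,
# i.e. to ignorance. (Illustrative numbers only.)
mass = {
    frozenset({"rain", "snow"}): 0.5,
    frozenset({"dry"}): 0.3,
    frame: 0.2,
}

def belief(event):
    """Total mass of focal sets entirely contained in the event."""
    return sum(m for focal, m in mass.items() if focal <= event)

def plausibility(event):
    """Total mass of focal sets compatible with the event."""
    return sum(m for focal, m in mass.items() if focal & event)

rain = frozenset({"rain"})
print(belief(rain), plausibility(rain))     # 0 0.7
print(belief(frame), plausibility(frame))   # 1.0 1.0
```

The gap between belief and plausibility is what allows the theory to represent ignorance in a way a single probability measure cannot.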


Continue reading →

The entire Hot Hand Fallacy is wrong!

Originally published in The Reasoner, Volume 10, Number 9, September 2016

The Rio 2016 Olympic Games have been, as usual, a great illustration of the Laplacian dictum according to which probability is partly due to our ignorance and partly to our knowledge. For the certainty that the best athlete(s) will win or, at the other end of the spectrum, the impossibility of figuring out who is more likely to win, would turn the greatest sporting event on the planet into a very dull couple of weeks.

Laplacian romanticism notwithstanding, sport generates data, lots of it. Not only does this allow for high-tech betting, it also feeds research on uncertain reasoning. In particular, SportVU, a body-tracking system which builds on Israeli military technology, played an unexpected role in addressing a long-standing question: is the Hot Hand phenomenon real, or is it just one of the many ways in which we tend to see patterns where there aren't any?
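One reason the question is subtler than it looks is a selection bias in how streaks are counted, pointed out by Miller and Sanjurjo. The simulation below is mine, not taken from the post, and its parameters are arbitrary; it shows that even a perfectly memoryless shooter appears to "cool down" after a run of hits when frequencies are averaged over finite sequences:

```python
# A quick simulation of the streak-selection bias: for a memoryless shooter,
# the frequency of a hit immediately after a run of hits, averaged over
# finite sequences, falls below the true hit probability.
import numpy as np

rng = np.random.default_rng(2016)

def post_streak_frequency(shots, streak):
    """Share of hits among shots that immediately follow a run of
    `streak` consecutive hits; None if no such opportunity occurs."""
    follows = [shots[i] for i in range(streak, len(shots))
               if all(shots[i - streak:i])]
    return np.mean(follows) if follows else None

n_seq, n_shots, p = 100_000, 20, 0.5
freqs = []
for _ in range(n_seq):
    shots = rng.random(n_shots) < p          # independent shots, P(hit) = p
    f = post_streak_frequency(shots, streak=3)
    if f is not None:
        freqs.append(f)

print(f"true hit probability: {p}")
print(f"mean post-streak frequency over sequences: {np.mean(freqs):.3f}")
# The mean comes out noticeably below 0.5, even though every shot is a
# fair coin flip: naive streak statistics understate a shooter's ability.
```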

Continue reading →

The Seven Pillars of Statistical Reasoning

Originally published in The Reasoner, Volume 10, Number 8, August 2016

Statistician Stephen Stigler put forward, in the 1980s, the amusing Law of Eponymy which bears his name(!). According to Stigler's Law, the vast majority (some say all) of scientific discoveries are not named after those who actually made the discovery. Wikipedia lists a rather impressive number of instances of Stigler's Law, featuring the Higgs boson, Halley's comet, Euler's formula, the Cantor-Bernstein-Schroeder theorem and, of course, Newton's first two laws of mechanics. Of particular interest is the case of Gauss who, according to this list, has his name mistakenly attached to three items.

Rather coherently, his recent book (S. Stigler, The Seven Pillars of Statistical Wisdom, Harvard University Press, 2016) presents the fascinating edifice of statistics by giving more emphasis to the key ideas on which its foundations rest than to the figures who came up with them. The seven pillars are: Continue reading →

Logic, Probability and the Foundations of Uncertain Reasoning

Here is the abstract of my invited talk at Logic, Algebra and Truth Degrees 2016, which will be held from 28 to 30 June 2016 in South Africa.

Logic, Probability and the Foundations of Uncertain Reasoning

The relation between logic and probability is a fascinating one. Leibniz saw them as two sides of the same coin, whereas Boole and De Morgan thought of probability as the natural generalisation of logic to reasoning under uncertainty. This cross-contamination rapidly declined with the coming of age of “mathematical logic”, which reached its climax at the end of the 1920s. Around that time, Kolmogorov was providing the definitive answer to Hilbert’s sixth problem: the axiomatisation of probability. By then, the research agendas of mathematical logic and probability showed virtually no overlap: a narrow focus on the foundations of mathematics for the former, and the assimilation into measure theory and the emerging theory of stochastic processes for the latter. At that point, neither logic nor probability was particularly concerned with uncertain reasoning and its foundations.

Things changed again during the 1980s, when the sodality of logic and probability was revived as a consequence of the pressing needs of artificial intelligence and formal epistemology. In this talk I will provide a (naturally biased) account of how this is currently affecting the foundations of reasoning and decision making under uncertainty. After providing some standard background, I will illustrate how the investigation of increasingly expressive measures of uncertainty based on non-classical logics, and in particular many-valued ones, has led to exciting research questions. I will conclude by briefly discussing how some of the ensuing answers provide interesting and to some extent surprising feedback on the classical notion of probability.
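To make the phrase "measures of uncertainty based on many-valued logics" slightly more concrete, here is a small numerical sketch (mine, not part of the abstract) of the textbook example: a state on [0, 1]-valued events, that is, an expectation-like functional which is additive on Łukasiewicz-disjoint events and reduces to an ordinary probability on crisp ones.

```python
# A sketch of a "state" on [0,1]-valued events: an expectation-like
# functional generalising finite additivity to many-valued events.
import numpy as np

rng = np.random.default_rng(7)

def oplus(a, b):   # Lukasiewicz disjunction
    return np.minimum(1.0, a + b)

def otimes(a, b):  # Lukasiewicz conjunction
    return np.maximum(0.0, a + b - 1.0)

# A finite sample space with a probability distribution on it.
probs = rng.dirichlet(np.ones(5))

def state(event):
    """Expectation of a [0,1]-valued event with respect to `probs`."""
    return float(np.dot(probs, event))

# Two many-valued events that are disjoint in the Lukasiewicz sense:
# their conjunction is identically 0 because they sum to at most 1.
a = rng.uniform(0.0, 0.5, size=5)
b = rng.uniform(0.0, 0.5, size=5)
assert np.allclose(otimes(a, b), 0.0)

# The state is additive on such events, mirroring finite additivity...
print(np.isclose(state(oplus(a, b)), state(a) + state(b)))   # True

# ...and on crisp (0/1-valued) events it is just the underlying probability.
crisp = np.array([1.0, 0.0, 1.0, 0.0, 0.0])
print(np.isclose(state(crisp), probs[0] + probs[2]))          # True
```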