Wednesday, June 18, 2014

Expansion of physics and consolidation of mathematics (and vice versa)

“The Trouble with Physics” is the title of an interesting and well-informed polemic by Lee Smolin against string theory and present mainstream physics at large. He notices a stagnation in physics, “so much promise, so little fulfillment” [Sm06, p. 313], and a predominance of anti-foundational spirit and contempt for visions, partly related, according to Smolin, to the mathematization paradigm of the 1970s: shut up and calculate. Basically, Smolin may be right.

Børge Jessen, the Copenhagen mathematician and close collaborator of Harald Bohr, once suggested distinguishing in the sciences and in mathematics between periods of expansion and periods of consolidation. Clearly physics had a consolidation period in the first half of the 20th century with relativity and quantum mechanics, while, to me, the mathematics of that period is characterized by an almost chaotic expansion in thousands of directions. From that point of view, the mathematics of the second half of the 20th century is characterized by an enormous consolidation, combining such disparate fields as partial differential equations and topology in index theory, integral geometry and probability in point processes, number theory, statistical mechanics and cryptography, etc. A true period of consolidation for mathematics, while, at least from the outside, one can have the impression that the physics of the second half of the 20th century was characterized merely by expansion, new measurements, new effects, and an almost total absence of consolidation or, at least, by the failure and vanity of all attempts in that direction.
Indeed, there have been impressive successes in recent physics, in spite of the absence of substantial theoretical progress: perhaps the most spectacular discovery, and the most important for applications, has been that of high-temperature superconductivity in various ceramic materials by Bednorz and Müller, achieved seemingly without mathematical or theoretical effort but only by systematic combinatorial variation of experiments, in the tradition of the old alchemists [BeMu87].

The remarkable advances in fluid dynamics, weather prediction, oceanography and climate modelling are mainly related to new observations and advances in computing power, while the underlying equations had been studied long before. Nevertheless, I have noticed a turn to theory among young experimental physicists in recent years, partly related to the investigation of energy landscapes in materials science, partly to the rediscovery of the interpretational difficulties of quantum mechanics in recent quantum optics.

Quantum gravity: (physical peril or) mathematical promise(?)


When we write... of “unprecedented challenges, where the achievements of spacetime physics and quantum field theory are called into question”, we are aware that large segments of the physics community are actually questioning the promised unified quantum gravity. We shall not repeat the physicists’ skepticism, which was skillfully gathered and elaborated, e.g., by Lee Smolin in [92]. Here we shall only add a skeptical mathematical voice, i.e., a remark made by Yuri Manin in a different context [76], elaborated in [77], and then try to draw a promising perspective out of Manin’s remark. The closing round table of the International Congress of Mathematicians (Madrid, August 22–30, 2006) was devoted to the topic “Are pure and applied mathematics drifting apart?” As panelist, Manin subdivided mathematization, i.e., the way mathematics can tell us something about the external world, into three modes of functioning (similarly Bohle, Booß and Jensen 1983, [10]; see also [13]):
  • (i) An (ad-hoc, empirically based) mathematical model “describes a certain range of phenomena, qualitatively or quantitatively, but feels uneasy pretending to be something more”. Manin gives two examples of the predictive power of such models, Ptolemy’s model of epicycles describing planetary motions of about 150 CE, and the standard model of elementary particle interactions developed from around 1960, besides legions of ad-hoc models which hide a lack of understanding behind a more or less elaborate mathematical formalism for organizing available data. 
  • (ii) A mathematically formulated theory is distinguished from an ad-hoc model primarily by its “higher aspirations. A theory, so to speak, is an aristocratic model.” Theoretically substantiated models, such as Newton’s mechanics, are not necessarily more precise than ad-hoc models; the coding of experience in the form of a theory, however, allows a more flexible use of the model, since its embedding in a theory universe permits a theoretical check of at least some of its assumptions. A theoretical assessment of the precision and of possible deviations of the model can be based on the underlying theory. 
  • (iii) A mathematical metaphor postulates that “some complex range of phenomena might be compared to a mathematical construction”. As an example, Manin mentions artificial intelligence with its “very complex systems which are processing information because we have constructed them, and we are trying to compare them with the human brain, which we do not understand very well – we do not understand almost at all. So at the moment it is a very interesting mathematical metaphor, and what it allows us to do mostly is to sort of cut out our wrong assumptions. If we start comparing them with some very well-known reality, it turns out that they would not work.”
Clearly, Manin noted the deceptive formal similarity of the three ways of mathematization, which are radically different with respect to their empirical foundation and scientific status. He expressed concern about the lack of distinction and how that may “influence our value systems”. In the words of [13, p. 73]: “Well founded applied mathematics generates prestige which is inappropriately generalized to support these quite different applications. The clarity and precision of the mathematical derivations here are in sharp contrast to the uncertainty of the underlying relations assumed. In fact, similarity of the mathematical formalism involved tends to mask the differences in the scientific extra-mathematical status, in the credibility of the conclusions and in appropriate ways of checking assumptions and results... Mathematization can – and therein lies its success – make existing rationality transparent; mathematization cannot introduce rationality to a system where it is absent ...or compensate for a deficit of knowledge.” 
Asked whether the last 30 years of mathematics’ consolidation raise the chance of consolidation also in phenomenologically and metaphorically expanding sciences, Manin hesitated to use such simplistic terms. He recalled the notion of the Kolmogorov complexity of a piece of information, which is, roughly speaking, “the length of the shortest programme, which can be then used to generate this piece of information ...Classical laws of physics – such fantastic laws as Newton’s law of gravity and Einstein’s equations – are extremely short programmes to generate a lot of descriptions of real physical world situations. I am not at all sure that Kolmogorov’s complexity of data that were uncovered by, say, genetics in the human genome project, or even modern cosmology data ...is sufficiently small that they can be really grasped by the human mind.” In spite of our admiration of and sympathy with Manin’s thoughtfulness, we, the authors of this review, shall reverse Manin’s argument and point to the astonishing shortness, in the sense of Kolmogorov complexity, of the main achievements in one exemplary field of mathematics, spectral geometry, to encourage the new unification endeavor.
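Manin’s point can be made tangible, if only very loosely. Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding gives a computable upper bound on it. The following minimal Python sketch (the function name and the choice of zlib are illustrative assumptions, not anything from the text) contrasts “law-like” data, generated by a tiny rule, with random data admitting no short description:

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """A computable upper bound on Kolmogorov complexity:
    the length of a zlib-compressed encoding of the data."""
    return len(zlib.compress(data, 9))

# "Law-like" data: 10,000 bytes generated by a very short rule,
# analogous to a physical law generating many observations.
regular = b"0123456789" * 1000

# Random data: 10,000 bytes with (almost surely) no short description.
random_ish = os.urandom(10_000)

# The regular data compresses to a tiny fraction of its raw length;
# the random data barely compresses at all.
print(len(regular), compressed_length(regular))
print(len(random_ish), compressed_length(random_ish))
```

In this analogy, Newton’s law of gravity plays the role of the short generating rule: a few symbols that reproduce an enormous body of observational data. Manin’s doubt is whether genomic or cosmological data admit any comparably short “programme” at all.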
Some of the great unifications in physics were preceded by mature mathematical achievements (like John Bernoulli’s unification of light and particle movement after Leibniz’s and Newton’s infinitesimals, and Einstein’s general relativity after Riemann’s and Minkowski’s geometries). Other great unifications in physics were antecedent to a comprehensive mathematical theory (like Maxwell’s equations for electromagnetism, long before Hodge’s and de Rham’s vector analysis of differential forms). A few great unifications in physics paralleled mathematical breakthroughs (Newton’s unification of Kepler’s planetary motion with Galilei’s law of free fall paralleled the calculus, and Einstein’s 1905 explanation of heat via diffusion paralleled the final mathematical understanding of the heat equation through Fourier analysis, the Lebesgue integral and the emerging study of Brownian processes). In this section, we shall argue for our curiosity about the new unification, nourished by the remarkable shortness of the basic achievements of spectral geometry and the surprisingly wide range of explanations (inner-mathematical ones) they induce.
Bernhelm BOOSS-BAVNBEK, Giampiero ESPOSITO and Matthias LESCH,