
Séminaire HistPhilPhys

We will have the pleasure of hearing:

  • Chen-Pang Yeang (University of Toronto)

Session theme: Instruments and numerical simulations

« Transforming Noise: A History of Its Science and Technology from Disturbing Sounds to Informational Errors, 1900-1955 »

Abstract: In this talk, I examine the historical origins of the attempts to understand, control, and use noise in modern times. Today, the concept of noise is employed to characterize random fluctuations in general. Before the twentieth century, however, noise meant only disturbing sounds. In the 1900s-50s, noise underwent a conceptual transformation from unwanted sounds that needed to be domesticated into a synonym for errors and deviations in all kinds of signals and information. I argue that this transformation proceeded in four stages. The rise of sound reproduction technologies—phonograph, telephone, and radio—in the 1900s-20s prompted engineers to tackle unwanted sounds as physical effects of media through quantitative representations and measurements. Around the same time, physicists developed a theory of Brownian motion for random fluctuations and applied it to electronic noise in the thermionic tubes of telecommunication systems. These technological and scientific backgrounds led to three distinct theoretical treatments of noise in the 1920s-30s: statistical physicists’ studies of the temporal evolution of Brownian fluctuations, radio engineers’ spectral analysis of atmospheric disturbances, and mathematicians’ measure-theoretic formulation. Finally, during and after World War II, researchers working on the military projects of radar, gunfire control, and secret communications converted the interwar theoretical studies of noise into tools for statistical detection, estimation, prediction, and information transmission. In so doing, they turned noise into an informational concept. Since grappling with noise involved multiple disciplines, its history sheds light on the interactions among physics, mathematics, mechanical technology, electrical engineering, and the information and data sciences in the twentieth century.

  • Arianna Borrelli (Technische Universität Berlin)

« Signal and noise before the black-box: The first computer-assisted data analysis in high energy physics (1952–1954) »

Abstract: Distinguishing signal from noise has long been a central problem of data analysis in high energy physics experiments, where a growing amount of background data hides a diminishing number of rare events to be studied. Computational tools have been developed to aid data analysis since the 1950s, and today a broad range of computer-assisted methods are deployed to distil the signal from the noise. These methods function like black boxes which, once sufficiently tested, enjoy the full trust of physicists.

In my presentation, I step back to a time when those black boxes did not yet exist, and when high energy physics faced a scarcity, rather than an abundance, of data from which to extract information. It was then that, for the first time, high energy physicists asked a computer for help. Between 1952 and 1954, a small group of computationally inclined scientists, including Enrico Fermi, Nicholas Metropolis, and Hans Bethe, used the MANIAC machine recently built at Los Alamos to try to assess whether experimental data contained evidence of a new microphysical phenomenon, a so-called “resonance.” Unlike today’s scientists, those authors freely combined computational algorithms with graphical methods, aesthetic criteria, and “theoretical intuition,” and in the end could not agree on whether the phenomenon was there or not. I will argue that taking a closer look at these practices, and at the questions they raised, allows one to better understand the approaches and assumptions that would later flow into the black boxes.


The session will be held in hybrid mode. Here is the Zoom link to connect:

https://u-paris.zoom.us/j/83329129991?pwd=DkDNti9bVza57cqb5BoIQg480rsuQI.1