Logic, computation, and normativity PICS 2014
Tokyo-Paris cooperation 2015-2017
It is usually assumed that logic is a normative science, as opposed to a descriptive science such as physics or biology. But what does that mean exactly? The answer depends crucially on what logic actually is, and because modern logic has evolved dramatically from the time of Frege’s Begriffsschrift (1879) to the present day, Frege’s classical conception of logical normativity can no longer be ours. This project is premised on the basic idea that the relationships between logic and computer science – more precisely, between logical proofs and computational programs – raise new issues about the very meaning of the normativity of logic.
In order to frame the issue, it will be useful at the outset to distinguish three conceptions of the normativity of logic, starting with Frege’s. Our contention is that none of these three conceptions provides a satisfactory answer to the question of normativity for contemporary logic; our goal is to set up an interdisciplinary team of philosophers, logicians, and specialists in the theory of computation in order to work on this precise issue.
(1) The Fregean conception of logical normativity
According to Frege, the normativity of logic means that the laws of logic do not describe how people actually think; they give prescriptions about how people should think. Frege’s point is not so much that people should abide by the laws of logic if they want to think correctly; his point is that those who do not respect the basic laws of logic do not think at all. Someone who persistently failed to follow the rules of logic could well utter sentences or entertain various mental representations, but this could not count as conceptual thinking. For Frege, the laws of logic are the laws of being true, and the normativity of logic is constitutive of thought (as in the case of a game: those who do not follow the rules of the game simply do not play it); it is not to be conceived as the regulation of an already existing activity (as in the case of the rules of traffic regulation in the streets).
This does not mean, however, that the laws of logic as Frege conceives them do not also have a descriptive content: they are descriptive in so far as they describe the most general objective features of the world. This Fregean conception of normativity is connected with two features of logic as Frege conceives it: logic is universal, and it articulates the laws of being true. Frege’s universalism is a kind of essentialism about truth.
(2) Normativity and classical logical systems
Frege’s logic is a logic of judgment (an inference is a relation between judgments, not between propositions or sentences). In later years, classical mathematical logic gives up the analysis of the assertive force of judgment and, with it, the Fregean conception of the normativity of logic.
After Hilbert’s program, with its idea of a formal system and of a metamathematical analysis of proofs, a logical system is defined as a system of signs with rules of formation and rules of inference. The rules are carefully chosen, and this choice no longer depends on the inconceivability (or absurdity) of any kind of non-logical conceptual thinking. The normativity of logic still depends on the idea of truth, as can be seen in the Tarskian definition of the relation of logical consequence; but truth is defined for systems of objects or of propositions which are conceived as independent of any relation of epistemic accessibility.
The normativity of logic – be it syntactic or semantic – is defined in terms of the properties of logical systems (consistency, completeness, compactness, normalization of proofs, etc.), not in terms of reasoning or thinking.
(3) Constructivism and normativity
The intuitionistic and constructivist criticism of classical logic (Kolmogorov, Brouwer, Heyting) induces a new conception of logical normativity. The question is no longer whether proposition P can be proved at all, or whether there exists at least one proof of P; the question now is which principles are used in the proof of P and what the structure of that proof is. New norms are formulated according to which kinds of proofs are distinguished: some proofs are accepted and others rejected. Not all proofs are equally correct, and a new interpretation of old proofs is provided. A norm of constructivity is introduced, and logical norms are no longer independent of the issue of epistemic accessibility. Roughly speaking, the issue of meaning is connected with logical normativity in the following way: the meaning of a sentence P is not defined by its truth conditions; it is defined by the possibility of providing a proof of P. As for the logical constants, their meaning is given by the logical rules which characterize their possible use in a proof. In this way, logical normativity is linked to some kind of verificationism.
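The constructivist identification of meaning with proof conditions can be illustrated by a minimal programming sketch, under the standard BHK reading of the connectives (the function and variable names below are our own illustrative choices): a proof of a conjunction is a pair of proofs, and a proof of an implication is a function from proofs to proofs.

```python
# Minimal sketch of the BHK reading of constructive meaning.
# A proof of "A and B" is modeled as a pair (proof_of_A, proof_of_B);
# a proof of "A -> B" is a function mapping proofs of A to proofs of B.

def swap(proof_of_conjunction):
    """A proof of (A and B) -> (B and A): exchange the two components."""
    a, b = proof_of_conjunction
    return (b, a)

print(swap((1, "x")))  # -> ('x', 1)
```

On this reading, grasping the meaning of the implication just is knowing how to transform any proof of its antecedent into a proof of its consequent.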
With these three conceptions of normativity in mind, the issue of normativity for contemporary logic can now be raised. One of the most important features of contemporary logic is its connection with computer science. This connection can be conceived from several viewpoints: computation and computability theory, the Curry-Howard isomorphism between proofs and programs, new interpretations of the logical constants (as in linear logic), etc. These connections with computer science raise new issues about the normativity of logic, from the viewpoint of computer science as well as from that of logic. For example, when a computer program is written, we want to be able to prove that the program is correct, i.e. that it does exactly what it was meant to do. In the case of the Curry-Howard isomorphism, the connection between logic and computing takes an even deeper form: the very concept of a proof corresponds to that of a program. In this view of logical proofs as programs, the ideas of normalization and of normal proof play a crucial role, and this is why the very idea of normativity – which has not been thoroughly investigated by philosophers from this precise viewpoint – is in need of further analysis.
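The correspondence between proof normalization and program evaluation can be sketched in a few lines (a simplified illustration, with names of our own choosing): a proof “detour”, i.e. an implication introduced and then immediately eliminated, corresponds under Curry-Howard to a beta-redex, and normalizing the proof amounts to evaluating that redex.

```python
# Sketch: a detour in a natural-deduction proof (->-introduction
# immediately followed by ->-elimination) corresponds to a beta-redex
# (lambda x: body)(argument).  Normalizing the proof is evaluating the
# redex; the normal proof contains no such detour, just as the value 42
# contains no pending computation.

detour_proof = (lambda x: x + 1)(41)  # redex: intro followed by elim
normal_proof = 42                     # the corresponding normal form

print(detour_proof == normal_proof)  # -> True
```

A normal proof, in this picture, is one in which no further such evaluation step is possible.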
The goal of our project is to provide a new analysis of logical normativity from the contemporary standpoint, where the notion of “normal” proofs becomes central. The idea of a normal proof was originally introduced by the Hilbert School for carrying out Hilbert’s Consistency Program, under the early influence of Husserl, and was later revived with Martin-Löf’s Intuitionistic Type Theory, with further developments through the Curry-Howard isomorphism. Today, the notion of a normal proof turns out to be essential to our understanding of logic, especially in the combined proof-theoretic and computational views of logic which are so central in contemporary logic and computer science. Recent research in proof theory puts the relation between proofs and computation in a new perspective and gives it a new twist, making it deeper than before. Linear logic, ludics, and further developments show that the notion of normal proof is essential in various settings of logic.
The notion of normalization has appeared in many important places, including computational theories such as the compositionality of game semantics. A typical example is the introduction, in linear logic, of various representations of proof normalization, through which the distinction between classical and intuitionistic logic is given a new sense and the classical distinction between semantics and syntax is critically examined. These new aspects of logic, as connected with computation and programs, call for an updated concept of logical normativity.
Our contention – and philosophical motivation – is that it is crucial to include the idea of norms of proofs, in addition to norms of logic, in order to understand the whole picture of contemporary logic and the new views on proofs, including the current interactions with logical theories of computation.
Our methodology will include an investigation of the very idea of logical normativity from several viewpoints: historical, technical (computational and logical), and philosophical.
From a historical perspective, we shall study and clarify more deeply the various conceptions of normativity, including the three sketched above, in order to arrive at a better understanding of their differences and, hopefully, at new and more fine-grained distinctions. We shall evaluate the impact on this issue of thinkers – logicians, mathematicians, and philosophers – such as Frege, Husserl, Heyting, Wittgenstein, Carnap (because of his position with regard to logical universalism and pluralism), or Martin-Löf. From a conceptual and technical (computational and logical) viewpoint, we shall examine and discuss the contemporary understanding of normal proofs and of normalization as a process, and their connections with and differences from the former notions of logical normativity. We shall pay particular attention to the computational constraints on questions of norms, and the viewpoint of computer scientists, as well as that of logicians and philosophers, will be crucial to our investigation.
The Japanese and the French teams have strong and complementary backgrounds in philosophy, logic (especially proof theory), history of logic, and theory of computation. Each member of the French team has already had the opportunity to collaborate with Mitsuhiro Okada, the scientific leader of the Japanese team (see the details below on past collaborations).
Moreover, it is quite clear that the collaboration of Japanese and French researchers, with their respective backgrounds, will be especially valuable for such a topic as normativity.
Researchers involved in the project:
1) Global Research Centre for Logic and Sensibility, Keio University, and Department of philosophy, Keio University
- Full members: Ryota Akiyoshi, Yasuhiro Arahata, Tatsuta Kashiwabata, Mitsuhiro Okada, Yuta Takahashi.
- Invited collaborators: Koji Mineshima (simulation science, Ochanomizu University), Kazuyuki Terui (Mathematical Institute, Kyoto University).
2) IHPST, UMR 8590, and Philosophie Contemporaine (Université Paris 1)
- Full members: Jocelyn Benoist (Phico), Jean Fichot (IHPST), Marco Panza (IHPST), Alberto Naibo (IHPST), Mattia Petrolo (IHPST), Pierre Wagner (IHPST) + Meven Cadet (PhD student, IHPST).
- Invited collaborators: Philippe Codognet (Japanese-French Laboratory for Informatics, Tokyo), Jean-Baptiste Joinet (University Lyon 3).