
2nd Workshop of the ESF Teams A (“Formal Methods”), D (“Physical Sciences”) and E (“Foundational and Methodological Debates”)
Utrecht University
19-20 October 2009

Abstracts Team A

Franz Dietrich (University of Maastricht)
Modelling reasons underlying preference 
(based on the working paper "A reason-based theory of rational choice" by F. Dietrich & C. List)

The standard rational choice paradigm explains an individual’s preferences by his beliefs and his fundamental desires. For instance, someone’s preference for joining the army might be explained by certain beliefs about what life in the army is like and a desire for such a life. When the paradigm is spelled out formally, the objects of preferences (such as possible professions) are usually ranked according to their expected utility, derived using a probability function reflecting current beliefs and a utility function reflecting never-changing fundamental desires. In consequence, changes in preference can result only from changes in belief, not from any fundamental changes in desire or motivation. This standard paradigm violates the intuition of many and is frequently criticised. One shortcoming is that reasons and motivations play no explicit role. Some of the more fundamental preference changes that one can undergo seem to reach beyond information-learning and to involve a change in the reasons or goals by which one is fundamentally motivated. Such changes of motivating reasons may come in connection with a changing ability to abstractly represent certain aspects of the world (like the thirteenth move in a game) or to imagine certain qualitative aspects of the world (like feelings of complete loneliness). But standard rational choice models do not, or at least not explicitly, address these phenomena. Rather, as one can argue, they implicitly assume away limitations or changes in conceptualisation (by identifying the individual’s representation of the world with the modeller’s) or in imagination (by using invariant fundamental desires). Criticisms of rational choice theory often suffer from not offering a formal alternative. The literature has of course modelled several bounded forms of rationality; these are important in their own right, but they usually take the expected-utility paradigm as their starting point and introduce certain deviations from it, without introducing reasons or motivations into the model. This paper proposes a formal reason-based model of preferences. The model explains an individual’s preferences by the set of reasons that motivate him. The preference of our example individual for joining the army would be explained by the set of reasons that motivate him, such as service to his country, an athletic body, and comradeship. Preference change in our model thus stems not exclusively from new information but often also from a change of the set of motivating reasons. If our example individual suddenly loses his preference for joining the army and joins a charity, new reasons (such as worldwide justice) might have become motivating while others (such as an athletic body) might have lost their motivational power. Our notion of a ‘(motivating) reason’ is essentially technical, and as such open to different interpretations and applications, like ones related to conceptualisation or imagination abilities. We formulate two natural postulates on reason-based preferences, the first ensuring that preferences are determined by the motivating reasons and the second ensuring that preferences change in a coherent way as additional reasons become motivating. These two postulates are shown to imply a simple representation of individual preferences: a single binary relation (which ranks the consistent reason sets) is sufficient to generate all individual preferences across possible individual states (i.e., possible sets of motivating reasons).
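For a schematic picture of the contrast just described (the notation is illustrative and is not taken from the paper itself): in the standard paradigm an option x is ranked by its expected utility, computed from beliefs p and fixed desires u, whereas in the reason-based model a single weighing relation on consistent reason sets generates the preferences at every individual state M, i.e. at every possible set of motivating reasons.

\[
  \text{standard:}\qquad x \succsim y \;\Longleftrightarrow\; \sum_{s} p(s)\,u(x,s) \;\ge\; \sum_{s} p(s)\,u(y,s),
\]
\[
  \text{reason-based:}\qquad x \succsim_M y \;\Longleftrightarrow\; M(x) \,\trianglerighteq\, M(y),
\]

where $M(x)\subseteq M$ stands for the motivating reasons that bear on option $x$ and $\trianglerighteq$ is the single binary relation on consistent reason sets mentioned above. On this picture, preference change can come either from a change in $p$ (new information) or from a change in $M$ (a change in what motivates).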
Although our model of preferences looks very different from classical rational choice models — for instance, we do not generally reduce preference change to belief change — a valid question is whether our model simply re-describes classic informational preference change in different terms. More precisely, can a change in motivating reasons (as it occurs in our model) be reduced to a change in information in some suitably enriched model? Indeed, an orthodox critic might immediately suspect that our reason-based model of preference change describes a macroscopic phenomenon whose microscopic origin is an informational change; and he might attempt to formulate a classical rational choice model to which our model can be reduced (just as cooperative game theory can to some extent be reduced to non-cooperative game theory). Such a classical reduction of our model — if it is possible — would make our model less of a departure from orthodox rational choice theory (without making it redundant, since a macroscopic perspective has its own merits, such as parsimony-related merits). We briefly engage with this important question, arguing against classical reducibility. We argue that our reason-based preference change model can instead be reduced to non-orthodox models of phenomena such as change of conceptualisation or imagination abilities. Our model is a macroscopic model that refrains from explaining why some reasons motivate and others don’t; this is on purpose, in part because the model should not be committed to one of the potential microscopic explanations.
We by no means intend to deny that many preference changes are information-driven, or to abolish the standard rational choice model. A comprehensive rational-choice-theoretic model would explain someone’s preferences by two factors: information and motivating reasons. While the present paper’s model captures only the latter and standard rational choice models capture only the former, it is possible to merge both approaches into a comprehensive model of preference formation and change, as is briefly discussed at the end.

Maria Carla Galavotti (University of Bologna)
Probability and Pragmatism
It is often claimed that pragmatism is both a philosophical movement and an attitude, or a method, as Peirce and James put it. On the one hand, the philosophers who are commonly called “pragmatists” did not have much to say about probability. A notable exception is Charles Sanders Peirce, who is considered the originator of the “propensity” interpretation. While Peirce’s ideas had a strong influence on Ernest Nagel’s views on probability, the propensity theory was later developed on somewhat different grounds by Karl Popper. On the other hand, the pragmatist attitude affects in various ways the debate on the foundations of probability. A distinctive pragmatist flavour marks the subjective interpretation, as admitted by its “fathers” Frank Ramsey and Bruno de Finetti. But even the writings of upholders of different interpretations of probability, like the frequentist Hans Reichenbach and the logicist Harold Jeffreys, disclose strong analogies with the pragmatist outlook. The paper will address the impact of pragmatism on the debate on the foundations of probability.

Adam Grobler (Opole University)
An Explication of the Use of IBE
The explication on offer will be given in terms of two set-theoretical criteria for comparing the relative explanatory power of alternatives. One criterion is designed to compare rival hypotheses put forward within a fixed body of background knowledge; the other is designed to compare an original body of background knowledge with its attempted revision. The proposal will be claimed to resolve problems of the Duhemian variety as well as the incommensurability problem.

Joke Meheus (University of Ghent)
Formal Logics for Abduction
The aim of this paper is twofold: (i) to distinguish between different types of abductive reasoning and (ii) to show that each of these types can adequately be captured in terms of so-called adaptive logics.
Adaptive logics for abduction have several nice properties. First, they neatly integrate deductive and abductive steps, and moreover have a decent proof theory. This proof theory is dynamic, but it guarantees that the conclusions derived at a given stage are justified in view of the insight into the premises available at that stage. Next, they have a nice semantics with respect to which the dynamic proof theory is sound and complete. Finally, they are much closer to natural reasoning than other available systems (for instance, logic-based approaches to abduction in Artificial Intelligence).
After briefly discussing the basic ideas behind adaptive logics for abduction, an overview will be given of the different application contexts for abductive reasoning. It will be shown, for instance, that in some application contexts one searches for a unique abductive explanation for each explanandum (the logically weakest one), whereas in others one looks for alternative explanations for the same explanandum. For each of the application contexts, a suitable adaptive logic will be presented (both the semantics and the proof theory).
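As a point of reference only (this is the generic abductive schema, not the specific adaptive logics presented in the talk), the abductive steps in question can be pictured as defeasible applications of “affirming the consequent”:

\[
  \frac{A \rightarrow B \qquad B}{A} \quad \text{(defeasible)}
\]

What the adaptive-logic framework adds is a dynamic proof theory in which such conclusions may later be withdrawn, as one’s insight into the premises improves or alternative explanations come into view.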

Thomas Müller (University of Utrecht)
Probability in a branching framework
There is a broad consensus that probabilities are graded possibilities. Formally, this is reflected in the fact that a probability space comprises a normalised measure on a sigma-algebra over a sample space. Conceptually, any framework involving probabilities has to presuppose some notion of possibility or other.
In my talk I will consider probability theory built upon two frameworks for branching possibilities: Prior-Thomason branching time and Belnap's branching space-times. Branching-time based probabilities resemble the well-known consistent histories in quantum theory. Branching space-time based probabilities are more general. In that framework, we can give a spatio-temporal motivation for independence assumptions and for the Markov condition. Furthermore, we can prove an interesting variant of the common cause principle.
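Two standard notions in the background may be worth recalling explicitly; the formulations below are the textbook ones, not the branching-specific versions developed in the talk. A probability space is a triple, and Reichenbach’s common cause principle asks for a screening-off event C behind any correlation between A and B:

\[
  (\Omega, \mathcal{F}, P), \qquad P:\mathcal{F}\to[0,1] \ \text{a countably additive measure with } P(\Omega)=1;
\]
\[
  P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C), \qquad P(A \mid C) > P(A \mid \neg C), \qquad P(B \mid C) > P(B \mid \neg C).
\]

In branching space-times, it is the spatio-temporal placement of the branch points that is supposed to motivate independence assumptions of this kind.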

Stathis Psillos (University of Athens)
The Limits of the No-Miracles Argument
The explanationist defence of realism is tenable (but more limited in scope than I initially thought). In this paper I will present the no-miracles argument as a two-stage proper ampliative argument in defence of inference to the best explanation and show that the obvious circularity implicated in this defence is harmless, especially if it is placed within an evaluative approach to epistemology. I will argue that, nonetheless, the no-miracles argument should not be seen as a global argument for the adoption of the realist framework, but as an argument within it. There is no way, I will claim, to have anything like a robust realist view of scientific theories without some (quite heavy) reliance on explanatory considerations and ampliative inferences that should be seen as species of an inferential genus: inference to the best explanation. But the pertinent explanatory considerations can be varied and more local in character.

J.W. Romeijn (University of Groningen)
How to Interpret Hypotheses about Chances?
Statistical inference typically starts off with a hypothesis about chance, or a set of such hypotheses collected in a statistical model. On the one hand, such hypotheses are statements about the data, which can presumably be true or false. On the other hand, they are expressed in terms of probability functions over the sample space, so they cannot be refuted by the data. It therefore seems that the empirical content of statistical hypotheses is unclear. This paper is an attempt to clarify what statistical hypotheses are, and to pin down their empirical content. Perhaps unsurprisingly, I will employ Von Mises' conception of probability. I will end my talk by sketching soundness and completeness proofs for Bayesian statistical inference, employing the well-known convergence theorems by Gaifman and Snir.
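To fix notation (in the standard Bayesian way; this is background, not a rendering of the soundness and completeness proofs themselves): a statistical hypothesis $h_\theta$ determines a probability function $P_\theta$ over the sample space, and Bayesian statistical inference updates a prior over the hypotheses by conditioning on the data $e$:

\[
  P(h_\theta \mid e) \;=\; \frac{P_\theta(e)\, P(h_\theta)}{\sum_{\theta'} P_{\theta'}(e)\, P(h_{\theta'})}.
\]

The Gaifman and Snir theorems concern the long-run convergence of such posteriors as the data accumulate, which is what makes them natural tools for the proofs sketched at the end of the talk.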

Maximilian Schlosshauer (University of Copenhagen)
Quantum probabilities from entanglement?
Can the notion of quantum probabilities and their quantitative representation in terms of Born's rule be derived from considerations of the properties of entangled states alone? In a series of papers, Wojciech Zurek has suggested that this can indeed be done, while a number of authors (including, in an early joint publication, Arthur Fine and myself) have pointed out flaws and hidden assumptions in Zurek's derivation. Synthesizing these criticisms and taking them a step further, I will present what I believe amounts to additional challenges to Zurek's approach. In particular, I will argue that the meaning and nature of the probabilities appearing in the derivation are insufficiently defined, a point that in somewhat different form has also been made by Barnum and Mohrhoff. In fact, the different steps of the derivation appear to embrace distinct probability concepts. I will also examine Zurek's treatment of the case of unequal coefficients and show that it poses further questions with regard to the physical role of the ancilla systems involved in the derivation. More constructively, I shall try to elucidate whether we might be able to salvage Zurek's basic ideas to build a conceptually more sturdy edifice that could lead to a satisfactory account of quantum probabilities and Born's rule as emergent from the bare quantum-state formalism.
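For orientation, the target of the derivation is Born's rule, and the starting point is an entangled system-environment (or system-ancilla) state; the schematic form below is standard and is not meant to reproduce the details of Zurek's envariance argument:

\[
  |\psi\rangle \;=\; \sum_k c_k\, |s_k\rangle\,|e_k\rangle, \qquad p(k) \;=\; |c_k|^2 .
\]

The issue raised in the talk is whether the probabilities $p(k)$, and their quantitative form, can be obtained from symmetries of such entangled states alone, without tacitly presupposing a prior probability concept.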

Jan Sprenger (Tilburg University)
The Logic of Explanatory Power (with Jonah Schupbach)
In this paper, we defend Bayesian Explanationism by offering an account of an hypothesis’s explanatory power relative to some evidence. We begin with a critical discussion of McGrew’s (2003) previous attempt at providing a Bayesian measure of explanatory power. Subsequently, we apply our insights in order to show that a set of deeply plausible conditions of adequacy for any acceptable measure of explanatory power determines a unique probabilistic measure. By highlighting several theorems about our measure, we defend it as an accurate account of an hypothesis’s explanatoriness relative to some evidence.
Lastly, we spell out the implications of our measure for Bayesian Explanationism, and the relationship between explanations and reasons.
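As an illustration of the kind of object at stake (this particular form is offered only as an example of a normalised probabilistic measure, not as a report of the measure defended in the talk), the explanatory power of a hypothesis $h$ with respect to evidence $e$ can be measured by

\[
  \mathcal{E}(e, h) \;=\; \frac{P(h \mid e) - P(h \mid \neg e)}{P(h \mid e) + P(h \mid \neg e)},
\]

which takes values in $[-1, 1]$ and is positive exactly when $e$ raises the probability of $h$ (and hence when $h$ raises the probability of $e$).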

Patrick Suppes (Stanford University)
Neglect of Independence and Randomness in the Axioms of Probability
The classic 1933 axiomatization of Kolmogorov is purely measure-theoretic. There is no mention or use of the concepts of independence or randomness, which, as ordinarily formulated, are the most important probability concepts not shared by all finitely additive measures. But simply setting the total measure to one is only a convention in the context of Kolmogorov's other axioms, without justification. (Kolmogorov does discuss independence in detail immediately after stating his axioms.) Axioms using independence and randomness can be added to eliminate this defect, as is made particularly clear by considering only qualitative axioms in the spirit of Hilbert’s well-known axioms for geometry. Such qualitative comparative axioms of additive probability have been much investigated, and those for independence less so. I give what I think are new qualitative comparative axioms for randomness (or uncertainty), and then explore, in this qualitative setting and using Padoa's principle, the independence of these three basic concepts: comparative additive probabilities, independence and randomness. The qualitative axiom that there is no randomness or uncertainty in the set Ω of all possible outcomes is the one that requires the measure of Ω to be one.
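For reference (in the standard quantitative formulation, not Suppes's qualitative axioms), Kolmogorov's axioms and the separately added product definition of independence are

\[
  P(A) \ge 0, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_i A_i\Big) = \sum_i P(A_i) \ \ \text{for pairwise disjoint } A_i \in \mathcal{F};
\]
\[
  A \perp B \ :\Longleftrightarrow\ P(A \cap B) = P(A)\,P(B).
\]

The normalisation $P(\Omega) = 1$ is the convention the abstract refers to; the qualitative axiom mentioned in the last sentence, that there is no randomness or uncertainty in $\Omega$, is what is meant to ground it.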

Gregory Wheeler (Center for Artificial Intelligence, New University of Lisbon)
Coherence, Confirmation, and Causation
While there are probabilistic models for confirmation and, more recently, probabilistic models for epistemic coherence, it is generally recognized that probability alone is not enough to explain how the two might go together as the coherence theory of justification says they should. But there is as yet no general understanding of why this is so. In this talk I present results (joint with Richard Scheines, CMU) illustrating how the relationship between coherence and confirmation is mediated through the causal structure that is believed to regulate the relationship between evidence and hypothesis, or between witness reports and the content of those reports as the case may be. Causal coherence theory provides a general account of when coherence and confirmation are tightly linked and when they are not, and it makes clear why the impossibility results are better viewed as a consequence of the causal structure regulating models of witness testimony than as a general result about probabilistic theories of coherence.

Jon Williamson (University of Kent)
An objective Bayesian account of confirmation
Bayesian accounts of confirmation explicate the degree to which total evidence E confirms a proposition θ as the degree to which an agent who grants E ought to believe θ. In turn, rational degree of belief is explicated in terms of probability. But there the consensus ends: Bayesians disagree as to which further conditions must be satisfied in order for degrees of belief to count as rational, and as to the extent to which evidence objectively determines degrees of belief. While Rudolf Carnap argued that confirmation is largely objective, his program was deemed to fall to several compelling objections, and subjective Bayesianism is now by far the dominant position in accounts of confirmation.
In this talk I shall present an objective Bayesian approach according to which an agent’s degrees of belief should conform to three norms: they should be probabilities, should be calibrated with evidence of physical probabilities, and should otherwise equivocate between the basic outcomes expressible in her language. It turns out that problems to do with Carnap’s original objectivist theory of confirmation can be overcome if these norms are cashed out in a suitable way.
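In Williamson's published work the equivocation norm is standardly cashed out via maximum entropy; the following is a sketch along those lines, assuming a finite language with basic outcomes $\omega \in \Omega$ and writing $\mathbb{E}$ for the set of probability functions calibrated with the evidence:

\[
  P_{\mathrm{bel}} \;\in\; \operatorname*{arg\,max}_{P \in \mathbb{E}} \; H(P), \qquad H(P) \;=\; -\sum_{\omega \in \Omega} P(\omega)\,\log P(\omega).
\]

Degrees of belief are thus probabilities (first norm), lie within $\mathbb{E}$ (second norm), and are otherwise as equivocal as the evidence allows (third norm).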

John Worrall (LSE)
The No Miracles Argument: what argument?
The ‘explanationist defence’ of Scientific Realism (Psillos 1999) is untenable. There is no method of inference to the best explanation worth the name. In science we simply formulate theories as best we can and hope they withstand tests. The history of science shows that we learn to find ‘explanatory’ whatever theories work – that is, withstand independent (predictive) testing. The so-called meta-level ‘abductive’ inference to scientific realism as the best explanation of science’s success, on the other hand, is plainly non-independently testable and is therefore clearly not a scientific explanation. The responses to the obvious circularity charges, moreover, are unconvincing.
The only honest position is to acknowledge that there is no general ‘No Miracles Argument’ worth the name. Instead, particular scientific successes induce the No Miracles Intuition. This intuition represents the slender thread on which Scientific Realism of any stripe must hang.

Abstracts Team D

Guido Bacciagaluppi (University of Aberdeen)
Quantum probabilities and time symmetry in the classical regime
It is generally accepted that the classical regime of quantum mechanics is characterised by the presence of decoherence effects. In this context it is, however, puzzling that decoherence is typically a time-directed phenomenon (one can intuitively think of it in terms of wavefunction branching), while classical mechanics is a time-symmetric theory. Building on previous work in which I have criticised the standard discussions of time symmetry in the decoherence literature, I suggest a way of approaching, and hopefully resolving, this puzzle.

Anouk Barberousse (ENS, Paris)
The Mathematics of Approximation and Their Use in Explanation
When approximations are used in explanations of phenomena, it is crucial to distinguish between two criteria according to which a numerical value is said to be approximate: exactness and precision. A more precise approximation is not always a better approximation, in the sense that it does not always help us to achieve a better explanation. The talk is devoted to examples in which a gain in precision leads us away from a good explanation. There are diverse reasons for this failure, which will be analyzed.

Jeremy Butterfield (Trinity College, Cambridge)
Against Pointillisme: A Call to Arms
This paper forms part of a wider campaign: to deny pointillisme.
That is the doctrine that a physical theory's fundamental quantities are defined at points of space or of spacetime, and represent intrinsic properties of such points or point-sized objects located there.
In this paper, I focus on spatial extrinsicality: i.e. on what an ascription of a property implies about other places. The main idea is that the classical mechanics of continuous media (solids or fluids) involves a good deal of spatial extrinsicality---which seems not to have been noticed by philosophers, even those who have no inclination to pointillisme.

Dennis Dieks (University of Utrecht)
The Gibbs Paradox Revisited
It is sometimes claimed that only quantum mechanics can provide a fully satisfactory solution of the Gibbs paradox: according to quantum mechanics particles of the same kind are “indistinguishable”, and this explains why there is no mixing entropy when two volumes of the same gas are mixed. By contrast, I will argue that analysis of the paradox supports the notion that particles are always distinguishable. I claim that this analysis extends even to quantum mechanics: the universally accepted notion that identical quantum particles are indistinguishable rests on a confusion about what quantum particles are.
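As background, the quantitative core of the paradox in its textbook form (recalled here only for orientation; the talk's own analysis is not reproduced): removing a partition between two volumes $V$ of ideal gas, each containing $N$ particles at the same temperature and pressure, gives a mixing entropy of

\[
  \Delta S \;=\; 2 N k_B \ln 2 \quad \text{for two different gases}, \qquad \Delta S \;=\; 0 \quad \text{for two samples of the same gas},
\]

and the discontinuous drop as the two gases are made “the same” is what calls for explanation.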

Mauro Dorato (University of Rome 3)
Can we have an objective now in spacetime physics?
In recent times there have been some attempts at introducing an objective present in Minkowski spacetime, one that, however, would also be capable of accommodating our psychological, or subjective, experience of the passage of time (Arthur 2006, Savitt 2006, 2008). Considering that Minkowski spacetime is the arena for three of the four interactions postulated by contemporary physics, this thesis, if correct, would be of the utmost importance. In this paper I will evaluate these proposals in light of spacetime physics in general and in view of some findings about our psychophysical experience of the present.

Carl Hoefer (University of Barcelona)
Can Boltzmannian Statistical Mechanical Probabilities Be Objective?
The well known tension between determinism and objective chanciness takes an especially acute form in the case of Boltzmannian Statistical Mechanics.  We are strongly motivated to avoid a purely epistemic reading of BSM probabilities, and indeed any obvious epistemic reading looks implausible once spelled out in detail. Yet propensity-type objective accounts of probability are straightforwardly incompatible with deterministic underpinnings.  So there is reason to think that only a Humean/actualist account of probabilities in BSM can give us the objectivity we want.  I will discuss the advantages and difficulties such an account faces in the context of BSM.

Klaas Landsman (University of Nijmegen)
Probability and Truth in Quantum Mechanics
The recently developed topos-theoretic approach to quantum mechanics may be construed as an attempt to redefine truth along the Kripkean lines of listing possible worlds in which a given proposition holds. This development is explained, along with its relationship to the familiar Born probabilities of quantum mechanics. 

Tomasz Placek (Jagiellonian University, Cracow)
A Locus for “Now” - a Modal Perspective
The notion of “now” seems to require a non-trivial relation of being co-present. Yet, as the arguments of Putnam, van Benthem and Stein show, in special relativity (SR) any transitive relation that satisfies some other natural requirement for co-presence is either the identity or the universal relation. Accordingly, there is no good locus for “now” in SR. I will show that if some extra modal structure is added to SR, an interesting relation of co-presence can be defined, leading to “nows” of possibly varying spatiotemporal extension. I will also point out how this approach can be generalized beyond SR.

Michiel Seevinck (University of Utrecht)
Analyzing Passion at a Distance: Progress in Experimental Metaphysics?
Violations of the well-known Clauser-Horne-Shimony-Holt (CHSH) inequality have frequently been analyzed in terms of the conditions of Parameter Independence (PI) and Outcome Independence (OI), introduced by Jarrett and Shimony in the mid-1980s. In the doctrine of Experimental Metaphysics it is violation of the latter condition (i.e. OI) that is supposed to be responsible for the violation of the CHSH inequality, and it has been extensively argued by many philosophers of this school that this is not an instance of action at a distance but of some innocent 'passion at a distance': one passively comes to know the faraway outcome, but one cannot actively change it. However, upon closer scrutiny it turns out that a satisfactory analysis of this passion at a distance has not yet been given. In this talk I present such an analysis, which, I claim, yields some surprising results.
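For reference, the conditions under discussion can be written in the standard Jarrett-Shimony form (this is textbook notation, not the talk's own analysis). With local settings $a, b$, outcomes $A, B$ and hidden state $\lambda$:

\[
  \text{PI:}\quad p(A \mid a, b, \lambda) = p(A \mid a, \lambda), \qquad p(B \mid a, b, \lambda) = p(B \mid b, \lambda);
\]
\[
  \text{OI:}\quad p(A \mid B, a, b, \lambda) = p(A \mid a, b, \lambda), \qquad p(B \mid A, a, b, \lambda) = p(B \mid a, b, \lambda).
\]

Their conjunction yields the factorisability condition from which the CHSH inequality $|E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2$ is derived; quantum mechanics violates the inequality while (on the standard analysis) respecting PI and violating OI, which is exactly the situation the doctrine of 'passion at a distance' is meant to describe.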

Henrik Zinkernagel (University of Granada)
Cosmic time and quantum fundamentalism
On a quantum fundamentalist view, the following question naturally arises: if the universe – either in its content or in its entirety – was once (and still is) quantum, how come there are (apparently) classical structures now? In cosmology, an assumed gradual emergence of classicality is framed in terms of a cosmic time parameter associated with the standard FLRW model. In this talk I examine the physical foundations for setting up this FLRW model and argue that the very notion of cosmic time in the model is crucially related to a classical behaviour of the material constituents of the universe (e.g. due to the so-called Weyl principle). This threatens the quantum fundamentalist idea that the material constituents of the universe could be described exclusively in terms of quantum theory at some early stage of the universe.
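To make the role of cosmic time concrete (this is the standard FLRW background, not the talk's argument itself), the FLRW line element is

\[
  ds^2 \;=\; -c^2\,dt^2 + a(t)^2 \left[ \frac{dr^2}{1 - k r^2} + r^2\big(d\theta^2 + \sin^2\theta\, d\varphi^2\big) \right],
\]

where $t$ is the cosmic time shared by comoving observers and $a(t)$ is the scale factor. The Weyl principle mentioned above requires that the worldlines of the matter constituents form a non-intersecting congruence orthogonal to the hypersurfaces of constant $t$; it is this classical behaviour of the matter content that allows a global cosmic time parameter to be defined in the first place.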

Abstracts Team E

Session E1 (Monday 14.00-17.30)

1.         Berna Kilinc (Boğaziçi University, Turkey)
“Kant on Scientific Explanation”

Kant’s philosophy of nature displays a drive to disclose necessity, circumscribe possibility, and eliminate chance.  The nature of explanation per se was not a primary concern in Kant’s philosophy, yet the anthropocentric desire to explain, to find the conditions, was.  Scientific explanation for Kant required a tightly organized systematic account of phenomena.  I examine how this systematicity in turn was, for him, the clue to the a priori knowledge of the necessities in nature.

2.         Elisabeth Nemeth (University of Vienna, Austria)
“Zilsel’s Adoption of Mach’s Conception of Scientific Laws”

Edgar Zilsel, who is recognized today as one of the pioneers of science studies, carried out cultural and historical studies with the aim of discovering laws of human history and culture that would be of the same kind as those discovered in the natural sciences. This fundamental similarity of scientific laws can only be seen, according to Zilsel, if one accepts the critical revision of the concept of law in modern physics proposed, among others, by Ernst Mach, who showed “that searching for natural laws means discovering functional relationships connecting states and processes in nature with one another” (Zilsel 1927). Applied to the social and cultural sciences, Mach’s insights, according to Zilsel, could contribute to resolving the problem of historical laws and also avoid “unfruitful discussions” both in these fields and in the natural sciences.

3.         Michael Stöltzner (University of Wuppertal)
“Development of the Pragmatic Justification of Induction”

Reichenbach’s pragmatic justification of induction strongly influenced the debates about inductive inference and statistical explanation that were waged in the 1950s and 1960s; and it was especially Wesley Salmon who emphasized both its merits and the fact that, in the end, it did not resolve Hume’s problem. Rather than rehearsing these fairly well-known debates, I will focus on the historical roots of Reichenbach’s thesis and argue that his pragmatic justification of induction, as outlined especially in his 1938 book Experience and Prediction, represented a major shift from the initially transcendental conception of induction that he had advocated, in changing guises, from his Ph.D. dissertation of 1915 until the early 1930s. This development took place against the backdrop of Reichenbach’s changing interpretation of the principle of causality and his evolving interpretation of probability.

4.         Artur Koterski (University of Lublin, Poland)
“The Rise and Fall of Falsificationism”

In his review of Logik der Forschung, Neurath formulated a variety of severe objections. Assuming a version of the Lakatosian taxonomy, I ask whether any stage of falsificationism was able to meet this oldest criticism. The answer is positive; however, there was quite a significant price to pay if Popper’s teaching was to overcome the difficulties already pointed out by Neurath. The changes introduced by ‘sophisticated Popperians’ reached the very core of the doctrine, forcing its followers to reject the central tenets and aims of Logik der Forschung.

Session E2  (Tuesday 9.30-13.00)

1.         Friedrich Stadler (University of Vienna, and Institute Vienna Circle, Austria)
“Historical Explanation in the Methodenstreit of the Historians”

The so-called Methodenstreit, which originated in economics between C. Menger and G. Schmoller, was followed by another variant in the historical sciences in the last decade of the 19th century. The contested methodological question was the existence, role and function of laws in history and the cultural sciences as necessary elements of any possible historical explanation (K. Lamprecht and L. M. Hartmann), as opposed to an exclusively intuitive understanding (“Verstehen”) advocated by G. v. Below, F. Meinecke and other proponents of German historicism (“Historismus”). M. Weber can be seen as a mediator between the two currents, arguing towards a unification of the cultural and social sciences (F. Ringer). This still unresolved dispute will be reconstructed and evaluated in the context of the treatment of scientific explanation in 20th-century philosophy of science.

2.         Thomas Uebel (University of Manchester, U.K.)
“Geisteswissenschaften in the Methodenstreit in Economics”

Commonly, the origin of the separation between the distinctive methodologies of the natural and the social sciences is traced to the work of Dilthey, who synthesised and systematised conceptions stemming from the hermeneutical tradition with the practice of 19th-century historiography. This paper will examine the hypotheses that this separation has yet another root, one not as widely recognised so far, namely the methodological dispute of the late 19th century between the so-called German Historical School and the Austrian School of Exact Economics, and that the separation of types of disciplines plays a not inconsiderable part in how central arguments in the socialist calculation debate in political economy in the first half of the 20th century are evaluated.

3.         Eric Schliesser (Leiden University, The Netherlands)
“The Stigler-Kuhn Correspondence and the Philosophical Prehistory of Prediction in Chicago Economics”

The most famous methodological piece within twentieth-century economics is Milton Friedman's "The Methodology of Positive Economics." This piece is taken to advocate predictionism: successful prediction as the most important goal for, and criterion of the success of, economic theories. While recent scholarship (by Kevin Hoover and Malcolm Rutherford, especially) has debunked the once popular myth that Friedman's so-called 'instrumentalist' methodology is derived from, or a simple application of, Popper, some important methodological sources remain to be explored. In this paper I use Friedman's well-known colleague, George Stigler, to shed light on Chicago methodology. I call attention to two surprising facts: first, I use archival material to introduce the Stigler-Thomas Kuhn correspondence; second, I explain the surprising Weberian roots (as mediated by Frank Knight and Talcott Parsons) of "Chicago methodology".

4.         Gürol Irzık (Sabancı University, Turkey) and Peter Spirtes (Carnegie Mellon University, USA)
A History of Causal Modeling

This paper aims to give a historical overview of the origin and development of causal modeling. It traces the birth of causal modeling back to the works of Sewall Wright during the first quarter of the 20th century and then shows how social scientists, such as Herbert Simon and Hubert Blalock, developed a similar approach in the fifties and sixties, quite independently of Wright's work. The paper then discusses how philosophers of science discovered causal modeling in the mid-eighties, what use they made of it, and how they have contributed to it since then.

Session AE3  (Tuesday 14.00-17.30)

2.         Graham Stevens (University of Manchester, UK)
“Russell on Non-Demonstrative Inference”

An important theme in Russell’s later (1940 onwards) work is his concern with the kinds of inference employed in scientific reasoning. Russell argues that such inferences are neither deductive nor, strictly speaking, inductive, as they are licensed neither by any deductive inference rule nor by the principle of induction. Furthermore, he thinks that the principle of induction in its purest form is demonstrably false. Scientific reasoning, he concludes, is justified by appeal to a small number of fundamental postulates. In this paper I will offer a reconstruction and assessment of the model of inference that Russell envisaged as underlying scientific reasoning.

4.         Pierre Wagner (University of Paris I, France)
“Carnap's Theories of Confirmation”

In the forties, Carnap devised a definition of the concept of degree of confirmation for a class of simple languages, hoping to extend it later to more comprehensive languages. This definition was the basis of his project of inductive logic, which he conceived as a branch of semantics. In this paper, I will give an overview of: i) Carnap’s first attempt at a definition of degree of confirmation; ii) some objections this attempt had to face; iii) Carnap’s amendments, which led to other theories of confirmation.