The Tetrast
Sketcher of various interrelated fourfolds.

Plausibility, verisimilitude, novelty, nontriviality, versus optima, probabilities, information, n-ary givens

December 24, 2013.

Latest significant edit: January 7, 2016. This post still needs work.

It has sometimes been noted that deductive conclusions claim nothing unentailed by their premisses, nothing informative or newsy in view of the premisses, yet often bring to their premisses a new aspect or perspective (a newness occasionally called 'psychological novelty'). That opposition between information and new perspective or aspect seems an instance of a noteworthy pattern.

The two series spelt out in this post's title line up nicely in such oppositions.

1. Optima & feasibles VERSUS plausibility as natural simplicity whereby surmise compensates for being wild.
Note: by "surmise" I mean abductive inference, pretty much, but definable (novelly or not, I don't know) as inference where the premisses neither deductively imply, nor are deductively implied by, the conclusion.

2. Probabilities VERSUS verisimilitude a.k.a. likelihood (in C. S. Peirce's sense) whereby induction compensates for being expansive.
Note: by "induction" (except in the case of mathematical induction, which is deductive) I mean inference from a sample or fragment to a whole, pretty much, but definable (not too novelly, I hope) as inference where the premisses do not deductively imply, but are deductively implied by, the conclusion.

3. Information, news, VERSUS new aspect whereby 'forward-only' deduction compensates for being constrictive.

4. Givens, data, facts, as n-ary complexuses, VERSUS nontriviality whereby equipollential deduction compensates for being utterly faithful (to the premisses).

Note that the four modes of inference explicitly or implicitly defined above have no generic overlap and exhaust the basic possibilities of classical entailment relations between premisses and conclusion.
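The exhaustiveness claim can be made concrete in a small Python sketch, treating claims as propositional functions and checking both directions of classical entailment by brute-force truth tables. The variable names, the `mode` helper, and the horse/squid encoding are mine, added for illustration; the classification itself follows the definitions above.

```python
from itertools import product

def entails(premisses, conclusion, n_vars):
    """True iff every truth-value assignment satisfying all premisses
    also satisfies the conclusion (classical propositional entailment)."""
    return all(conclusion(v)
               for v in product([False, True], repeat=n_vars)
               if all(p(v) for p in premisses))

def mode(premisses, conclusion, n_vars):
    """Classify an inference by the two directions of entailment,
    per the four definitions in the post."""
    fwd = entails(premisses, conclusion, n_vars)   # premisses |= conclusion?
    back = entails([conclusion],                   # conclusion |= premisses?
                   lambda v: all(p(v) for p in premisses), n_vars)
    return {(True, True): "equipollential deduction",
            (True, False): "'forward-only' deduction",
            (False, True): "induction",
            (False, False): "surmise"}[(fwd, back)]

# v[0] = "there is a horse here", v[1] = "there is a squid here"
horse = lambda v: v[0]
squid = lambda v: v[1]
both  = lambda v: v[0] and v[1]

print(mode([both], horse, 2))   # 'forward-only' deduction
print(mode([horse], both, 2))   # induction
print(mode([horse], squid, 2))  # surmise
print(mode([horse], horse, 2))  # equipollential deduction
```

Since each of the two entailment directions either holds or fails, the four cases exhaust the possibilities, which is the point of the note above.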

I will discuss the oppositions not individually but collectively, and will focus most of all on the patterns made by each series. I will assume that the reader attributes at least some importance to the mathematics of optimization, probability, etc. As to the importance of perspectives brought by conclusions, I will point out that a conclusion generally needs to offer a more or less promising or fruitful perspective or aspect (be it plausibility, likelihood, new aspect, or nontriviality) as a merit and indeed a rationale, in order to help motivate inference and reasoning; each of those perspectival merits is hard to quantify usefully, yet they form a system reflecting that of the inference modes that they help motivate.

The perspectival merits in inference in Table (B) are one-to-one reminiscent of, but NOT equatable to, the deductive topics in Table (A):

(A) Quasi-modal deductive topics
& associated means of quantitative expression.
1. Optima & feasibles.
—Differences (distances) with directions.

2. Probabilities.
—Ratios.

3. Information, 'news'.
—Logarithms.

4. Givens (data, etc.)
as n-ary complexuses.
—Roots, bases, as arities,
allied to other relations,
e.g., transitivity.
(B) Perspectival merits in inference,
resistant to being usefully quantified.
1. Plausibility,
natural simplicity,
brought by surmise.

2. Verisimilitude (in
C. S. Peirce's sense)
brought by induction
(inference from
sample to whole).

3. New aspect
brought by 'forward-only' deduction.

4. Nontriviality
brought by equipollential
(a.k.a. reversible) deduction.
Both tables have the following pattern, with notable oppositions along its diagonals:

(C) Pattern in common
1. Simple, doable, compelling.

2. Apt, consistent, consonant.

3. New, distinctive.

4. Complex, structured.

(The oppositions run along the diagonals: 1 vs. 4, simple vs. complex; 2 vs. 3, consonant vs. new.)

Note: I reached the ordering optima, probabilities, information, givens (i.e., logic) as reflecting some broad correlations with levels of concrete phenomena (see Table (D)), such that it seems an order of being or of (decreasing) abstractness, and its reverse seems an order of (decreasing) familiarity.
(D) Some broad (not too restricted) correlations
1. Optima & feasibles.
—Decision processes.
Motion, forces.
2. Probabilities.
—Stochastic processes.
3. Information, 'news'.
—Communicational &
 control processes.
4. Givens, data, facts, basēs
(for further conclusions).
—Inference processes.
(I reached the ordering of difference, ratio, logarithm, base by association with the deductive topics; the result is somewhat striking, although I haven't seen why, from a purely arithmetical viewpoint, logarithm would precede rather than rank alongside or follow base there; also I played, probably too long, with the idea that an inverse not of exponentiation but of tetration ought to be associated with logic, with some notion of power sets floating in my mind.)

The topics in Table (A) are the main topics of abstract and significant mathematically deductive areas concerned with structures of alternatives in terms of timelike or (quasi-)modal cases (see Table (E)).

The perspectival merits of inference listed in Table (B) have familiar forms and some intellectual history (see further below).

(E) Temporal (or at least timelike) quasi-modal cases, combined into an encompassing picture

Lightcone-like Structure Compounded of (Quasi-)Modal Cases

Optima & feasibles: of, or as if of, the almost-now, more or less along the surface of the future.

Probabilities: of, or as if of, the more gradually addressable future. (Zadeh's possibility theory seems to pertain to this area too; he calls it an alternative to probability theory.)

Information, news: of, or as if of, the just-now, more or less along the surface of the past.

Givens, data, facts: of, or as if of, the more settled, layered past.

The lightcone-like structure is evoked by assuming, even if only vaguely, that not only is motion relative but also there is a finite speed limit of communication and causation — e.g., the lightspeed constant in the known physical universe. But I guess that the present purpose could be adequately served by a more Galilean, less constrained kind of relativity picture if it were at least to exclude infinite speeds. Anyway, one will also want to encompass situations in which the top practical speed of some things is much lower than a universal signal-speed limit (light speed) and is not exact, universal (across events), or invariant (across various inertial reference frames); hence one must often, for example, allow some vagueness as to whether information arrives at lightspeed or somewhat more slowly.

Fruitful aspects or perspectives given by conclusions to premisses

1. Simplicity of a surmise's conclusion (an explanatory hypothesis) is a familiar idea. One version of it is parsimony or Ockham's Razor; C. S. Peirce held that logical simplicity is secondary and that at its extreme it would add no explanation to a surprising observation; he explored simplicity of explanation as plausibility, facility, naturalness, instinctual attunement, and Galileo's 'natural light of reason' (see the linked passage from "A Neglected Argument," 1908).

2. An inductive conclusion's verisimilitude, in Peirce's sense, is an idea familiar in the form of that which people mean in speaking of an induction as an inductive generalization, expecting the whole to resemble a sample (preferably a fair one).

3. New or novel aspect of conclusion of a 'forward-only' deduction such as a categorical syllogism has been noted by various people including Peirce, and they have typically seen it as pertinent to deduction generally. It is familiar in the sense that it is considered perspectivally redundant, for example, to conclude a deduction merely by restating a premiss in unchanged form.

4. Nontriviality or, in stronger dose, depth of a mathematical conclusion is a familiar idea among mathematicians, and has been an element in the formation of ideas of complexity. Conclusions in pure mathematics tend to be equipollent (propositionally equivalent) to their premisses, as Aristotle noted (Posterior Analytics, Bk. 1, Ch. 12, link to pertinent text).

Each fruitful aspect counterbalances, quasi-ironically, the essential structure of the entailment relations between premisses and conclusions that defines the inference mode to which the aspect pertains. So, the collectively systematic character of the definitive entailment relations is reflected in the collectively systematic character of the fruitful aspects.

1. A surmise's conclusion (with 'surmise' as defined below) is complex in the sense of both adding to, and subtracting from, what the premisses claim, and is of interest when it nonetheless brings a simple perspective.

2. An induction's conclusion is novel in the sense of adding to, but not subtracting from, what the premisses claim, and is of interest when it nonetheless brings a 'conservative' or 'frugal' (verisimilitudinous or 'likely') perspective.

3. A 'forward-only' deduction's conclusion retrenches in the sense of subtracting from, but not adding to, what the premisses claim, and is of interest when it nonetheless brings a novel perspective.

4. An equipollential deduction's conclusion is simple in the sense of neither adding to, nor subtracting from, what the premisses claim, and is of interest when it nonetheless brings a complex or nontrivial perspective.

Note: Induction as actually framed in practice sometimes has a conclusion that does not fully entail its premisses, even though we think of induction as inferring from a part (a sample or fragment) to a whole including the part. See "Deductive vs. ampliative; also, repletive vs. attenuative".

Deduction is so defined that its conclusions are true if its premisses are true, while other modes of inference lack that character; this has been seen as a problem for the other modes. In fact any inference depends on being correctable in its premisses and procedure through larger inquiry and the testability of conclusions. Certainly surmise to an explanation is the least secure mode of inference, but also the most expeditious, as Peirce pointed out. Now a valid deduction secures its conclusion's truth if its premisses are true, while other inference modes only suggest, with more or less strength, their conclusions' truth, and do so by the perspective they bring and other considerations (methods of sampling, weakness of alternate explanations, etc.) rather than only by general definitive entailment structure. But a deduction's conditional assurance of a true conclusion is counterbalanced by the perspective of novelty or nontriviality that a worthwhile deduction brings; dubitability is natural, and right in a way, in deduction. The deduction's conclusional perspectives (novelty, nontriviality) that, with varying strength, incline one to check one's reasonings and claims for falsity, inconsistency, etc., are the same ones that make deduction worthwhile.

1. A surmise's natural plausibility, and
2. an induction's Peircean verisimilitude,
suggest that a conclusion might be true although non-deductive, and offer the hope of helping inquiry toward eventually finding (respectively) the real natural simplicities and the real reliabilities in things.

3. A 'forward-only' deduction's novelty of aspect, and
4. an equipollential deduction's nontriviality,
suggest that a conclusion might be false despite being or seeming deductive (involving an error perhaps in the reasoning or perhaps in the premisses — indeed, the main purpose of some deductions is to expose the premisses to testing), and offer the hope of helping inquiry toward eventually finding (respectively) the real surprising ramifications and the real deep structures in things.

Correlation between quasi-modal deductive topics and inference modes' valuable aspects (plausibility, verisimilitude, novelty, nontriviality)

Quasi-modal topics of so-called 'applied' but abstract and significant mathematically deductive areas are listed below, each paired with the corresponding aspect that has merit in inference.
  Valuable, fruitful aspects or perspectives into which conclusions put premisses are sorted below by inference mode.
1. Optima & feasibles.
  • Differences (distances) with directions.
  • Mathematics of optimization (longer known as linear & nonlinear programming) applies extremization, calculus of variations, Morse theory.
Naturalness, simplicity, plausibility, viability, cogency.
Suggested technical name: viatility.
  • Pertains to surmise, explanatory hypothesis, that which Peirce called "abductive inference." I propose a (non-Peircean) definition of surmise as inference that automatically preserves neither truth nor falsity (the premisses don't deductively imply the conclusions, and the conclusions don't deductively imply the premisses). Even if one excludes cases where the premiss set is inconsistent with the conclusion set (e.g., "there is a horse, so there isn't a horse"), natural simplicity, cogency, etc., remain an important consideration in the surmises that remain. The inference "There is a horse here, so there is a squid here" is a surmise without natural simplicity or cogency. Now, one could restate a cogent surmise to make it seem, technically, an induction, its premisses implied by but not implying its full conclusions: "Normally after rain the lawn is wet and this morning the lawn is wet. So, normally after rain the lawn is wet, this morning the lawn is wet, and last night it rained." But this is like turning a traditional syllogism into an equipollential deduction by restating all the premisses in the conclusion. Such representation of the inference does not faithfully enough reflect the guiding inquiry interest of the inference and instead counterproductively obscures it with redundancies.
  • Studies of concrete phenomena (that is, sciences and studies physical, material, biological, and human / social) are for drawing conclusions that typically are (cogent) surmises (but they do apply the other modes of inference along the way). As Peirce points out in discussing verisimilitude in induction (Collected Papers Vol. 2, in Paragraph 633): "Strictly speaking, matters of fact never can be demonstrably proved, since it will always remain conceivable that there should be some mistake about it. [....] Indeed, I cannot specify any date on which any certain person informed me I had been born there; and it certainly would have been easy to deceive me in the matter had there been any serious reason for doing so; and how can I be so sure as I surely am that no such reason did exist? It would be a theory without plausibility; that is all." Hence, even a verisimilar and high-confidence induction, if it is from concrete individual cases, rests in the end on premisses which are affirmed because their falsity is implausible, or so taken, and which amount in the end to cogent surmises. Hence the ultimate standing of conclusions sought by studies of concrete phenomena is that of cogent surmise. Now, I've cited Peirce saying something that I believe helps imply my conclusion; but in or around the same year (1910), he held that all determination of probabilities of actual objects (such as actual dice) rests on verisimilitudes (Collected Papers Vol. 8, in Paragraph 224) and did not mention a dependence of those verisimilitudes on plausibilities (or cogencies).
2. Probabilities.
  • Ratios (such that 0 ≤ the ratio ≤ 1).
  • Mathematics of probability applies general measure theory, enumerative combinatorics.
Verisimilitude, likelihood, in C. S. Peirce's sense.
Suggested technical name: veteratility.
  • Pertains to induction, inference that automatically preserves falsity but not truth (the premisses don't deductively imply the conclusions, but the conclusions deductively imply the premisses).
      Verisimilitude in Peirce's sense consists in that, if pertinent further data were to continue, until complete, to have the same character as the data supporting the conclusion, the conclusion would be proven true. The phrase "inductive generalization" is an approximate way of saying "induction with some verisimilitude." The inference "there is a horse here, so there are a horse and a squid here" is an induction without verisimilitude.
  • Studies of positive phenomena in general (studies such as inverse optimization, statistics, information theory's induction-oriented areas, and maybe philosophy) are for drawing conclusions typically inductive.
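The 'ratios' correlate for probability can be illustrated with a minimal enumerative-combinatorics sketch. The dice example is mine, not the post's; it simply shows a probability arising as a ratio of favorable to equally likely cases, bounded between 0 and 1.

```python
from fractions import Fraction
from itertools import product

# Probability as a ratio (0 <= ratio <= 1) obtained by enumerative
# combinatorics: favorable cases over equally likely cases.
outcomes = list(product(range(1, 7), repeat=2))   # two fair dice
favorable = [o for o in outcomes if sum(o) == 7]  # rolls summing to 7
p = Fraction(len(favorable), len(outcomes))
print(p)              # 1/6
assert 0 <= p <= 1    # a probability is a ratio in [0, 1]
```

Using `Fraction` rather than a float keeps the ratio character of the quantity visible.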
3. Information, 'news'.
  • Logarithms.
  • "...for every unconstrained information inequality," i.e., every law of information, "there is a corresponding group inequality, and vice versa." — Yeung (2012), p. 381. (Application of group theory reveals non-Shannon-type laws of information, see tutorial by Yeung.)
New aspect, elucidativeness, significance in a sense.
Suggested technical name: novatility.
  • Pertains to 'forward-only' deduction, inference that automatically preserves truth but not falsity (the premisses deductively imply the conclusions, but the conclusions do not deductively imply the premisses). The inference "There are a horse and a squid here, so there is a horse here" is a 'forward-only' deduction without novelty of aspect. The inference "All A is B, and all B is C, so all A is C" is a 'forward-only' deduction with some novelty of aspect.
      Novelty of aspect of the kind sought in 'forward-only' deduction involves some tightening of focus, which is lacking in such a 'forward-only' deduction as "Socrates is yonder, so Socrates is here or yonder." (Still, it is important sometimes to remember how "surprisingly" broad alternatives are deductively implied — especially on those occasions when one is surprised by a subtle role of such implications in one's reasoning.)
  • 'Applied' but abstract and significant mathematically deductive areas — mathematics of optimization, of probability, of information, and of logic — are for drawing conclusions that typically are 'forward-only' deductions. I don't say that equivalences, e.g., that between p and T→p, are unimportant in those fields. But their general aim at 'forward-only' deduction is why they are correlated with areas of inductive research as their "inverses" or "reverses." Reverse pure mathematics, on the other hand, is, if not pure mathematics per se, still mathematical logic about pure mathematics; it would not be the first research program to originate in mathematical logic and then grow into a part of pure mathematics, if some old accounts that I recall about non-standard analysis are correct. My understanding is that probability theory is also often applied in pure mathematics, and not always trivially. These so-called 'applied' areas need another generic name.
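The 'logarithms' correlate for information can be made concrete with Shannon's information content, a standard illustration that is mine rather than the post's: the logarithm turns multiplication of probabilities into addition of information, so the news value of two independent events is the sum of their separate news values.

```python
from math import log2

def surprisal(p):
    """Shannon information content (in bits) of an event of probability p."""
    return -log2(p)

# Independent events: p(A and B) = p(A) * p(B), so their information adds.
p_a, p_b = 0.5, 0.25
assert abs(surprisal(p_a * p_b) - (surprisal(p_a) + surprisal(p_b))) < 1e-12
print(surprisal(p_a), surprisal(p_b), surprisal(p_a * p_b))  # 1.0 2.0 3.0
```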
4. Givens (data, etc.) as nullary, unary, binary, n-ary, complexuses. (See Arity.)
  • Roots, bases, as arities (a.k.a. adicities, valences) of relations, allied with other relational properties, e.g., transitivity.
  • Mathematical logic applies order theory.
Nontriviality, depth, complexity, 'lessonfulness'.
Suggested technical name: basatility.
  • Pertains to equipollential deduction, inference that automatically preserves truth and falsity alike (the premisses deductively imply the conclusions and vice versa). The inference "There are a horse and a squid here, so there are a horse and a squid here" is an equipollential deduction without nontriviality. The inference "3 × 5 = 15, so 15 ∕ 5 = 3" is an equipollential deduction with at least a jot of nontriviality. When the well-orderedness of the pertinent set is a standing given, then the mathematical-induction step, from the ancestral case conjoined with the hereditary case, to the conclusion, is an equipollential deduction.
  • 'Pure' mathematics is for drawing conclusions typically deductive through equivalences and equipollencies. 'Forward-only' deduction sometimes occurs, especially when greater-than or less-than statements are involved. I've read that, because of this, in a mathematical induction a premiss (such as the ancestral case) is not always to be proven by simply reversing the order of one's scratch work; sometimes instead one must find some other way back.
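The arity-and-transitivity correlate for givens can be sketched extensionally, representing a relation as a set of tuples whose uniform length is its arity; the representation choices here are mine, added for illustration.

```python
# Givens as n-ary complexuses: a relation represented extensionally as a
# set of tuples; its arity (adicity, valence) is the tuple length.
def arity(relation):
    (n,) = {len(t) for t in relation}   # assumes a uniform, nonempty relation
    return n

# Transitivity as an allied relational property of dyadic relations.
def is_transitive(r):
    return all((a, d) in r
               for (a, b) in r for (c, d) in r if b == c)

less_than = {(a, b) for a in range(4) for b in range(4) if a < b}
print(arity(less_than))          # 2 (a dyadic relation)
print(is_transitive(less_than))  # True
```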

The complexity theorist Cosma Shalizi (in his notes "Complexity" and "Complexity Measures") has said that complexity is ill-defined and its proposed general measures not actually useful. Now, the idea of complexity seems based on the idea of nontriviality in mathematics ("deep" in mathematics means "very nontrivial," I've been told), and the effort at quantifying complexity seems aimed at finding an information-like quantity. Yet, the idea of information as a quantity is rather simple. Now, one might argue that the analogous idea for complexity will be complex because it's about COMPLEXITY, of course. But that is to say that it is not a quantity on a mathematical par with information in the sense that probability is. Complexity, at least in that which seems the idea's core sense of nontriviality and depth as in mathematics, seems akin in various ways, including its importance and tantalizing character, to aspectual novelty, verisimilitude (in C. S. Peirce's sense), and plausibility (natural simplicity). So classing it, one ends up with four kindred aspects in inference, aspects that resist computation-friendly formulations but are vital as forms of value or merit in inference. Data or, more generally, givens, in their character of adicity or arity, with its definabilities and its own share of importance, seem a better "complexity" counterpart to information. For logic, there are, first of all, the structures of alternatives (such as that represented by the variable x) and of compounds conjunctive, conditional, etc., and such is the turn of interest that groups mathematics of logic with those of information, probability, and optimization. Now, when each element of such a compound or relation is itself a relation with the same arity, the arity is like a root or base raised to successive powers, though its sheer size is not the only point. 
Anyway, logic gains stature as a serious subject, as Quine pointed out, when it comes to the study of relative terms (in polyadic quantification), which have arities of their own; relative terms are a complicating factor in logic. Now, such a relation as the dyadic '__discussing__' is itself not a compound conjunctive, disjunctive, or otherwise; nor are relations such as pure mathematical operations, functions, etc.; yet the forms of such relations are what is applied to represent the forms of alternatives and other such compounds which, for their part, are in a sense (be it literal or figurative) relations among worlds.

