PhD, Postdoc with Rosalie Iemhoff

Postdoc position in Logic at Utrecht University, the Netherlands.

The postdoc is embedded in the research project “Optimal Proofs”, funded by the Netherlands Organization for Scientific Research and led by Dr. Rosalie Iemhoff, Department of Philosophy and Religious Studies, Utrecht University. The project in mathematical and philosophical logic is concerned with formalization in general and with proof systems as a form of formalization in particular. Its mathematical aim is to develop methods to describe the possible proof systems of a given logic and to establish, given various criteria of optimality, what the optimal proof systems of the logic are. Its philosophical aim is to develop general criteria for faithful formalization in logic and thereby to distinguish good formalizations from bad ones. The mathematical part of the project focusses on, but is not necessarily restricted to, the (non)classical logics that occur in computer science, mathematics, and philosophy, while the philosophical part also takes into account domains where formalization in logic is less common. The postdoc is expected to contribute primarily to the mathematical part of the project. Whether the postdoc’s research also extends to the philosophical part of the project depends on his or her interests.

Qualifications

We are looking for a talented and dedicated researcher with a PhD in logic, preferably in mathematical or philosophical logic, with an excellent track record and research skills relative to experience, excellent academic writing and presentation skills, and publications in high-level journals or books.

Additional information

For more information on the practical details of the positions and the application procedure, please visit
https://www.academictransfer.com/nl/47996/postdoc-position-in-logic-10-fte/
https://www.uu.nl/en/organisation/working-at-utrecht-university/jobs
For more information on the project, please contact Rosalie Iemhoff at r.iemhoff@uu.nl.

Deadline for applications: 22 June, 2018.

PhD position in Logic at Utrecht University, the Netherlands.

The PhD position is embedded in the research project “Optimal Proofs”, funded by the Netherlands Organization for Scientific Research and led by Dr. Rosalie Iemhoff, Department of Philosophy and Religious Studies, Utrecht University. The project in mathematical and philosophical logic is concerned with formalization in general and with proof systems as a form of formalization in particular. Its mathematical aim is to develop methods to describe the possible proof systems of a given logic and to establish, given various criteria of optimality, what the optimal proof systems of the logic are. Its philosophical aim is to develop general criteria for faithful formalization in logic and thereby to distinguish good formalizations from bad ones. The mathematical part of the project focusses on, but is not necessarily restricted to, the (non)classical logics that occur in computer science, mathematics, and philosophy, while the philosophical part also takes into account domains where formalization in logic is less common. The PhD student is expected to contribute primarily to the mathematical part of the project. Whether the PhD student’s research also extends to the philosophical part of the project depends on his or her interests.

Qualifications

We are looking for a talented and dedicated student with a master’s degree or equivalent degree in mathematics, computer science, or philosophy, specializing in logic or a related area.

Additional information

For more information on the practical details of the positions and the application procedure, please visit
https://www.academictransfer.com/nl/47995/phd-position-in-logic-10-fte/
https://www.uu.nl/en/organisation/working-at-utrecht-university/jobs
For more information on the project, please contact Rosalie Iemhoff at r.iemhoff@uu.nl.

Deadline for applications: 22 June, 2018.

Raymond Smullyan

Proof by legerdemain

Peli Grietzer shared a blog post by David Auerbach on Twitter yesterday containing the following lovely quote about Smullyan and Carnap:

I particularly delighted in playing tricks on the philosopher Rudolf Carnap; he was the perfect audience! (Most scientists and mathematicians are; they are so honest themselves that they have great difficulty in seeing through the deceptions of others.) After one particular trick, Carnap said, “Nohhhh! I didn’t think that could happen in any possible world, let alone this one!”


In item # 249 of my book of logic puzzles titled What Is the Name of This Book?, I describe an infallible method of proving anything whatsoever. Only a magician is capable of employing the method, however. I once used it on Rudolf Carnap to prove the existence of God.


“Here you see a red card,” I said to Professor Carnap as I removed a card from the deck. “I place it face down in your palm. Now, you know that a false proposition implies any proposition. Therefore, if this card were black, then God would exist. Do you agree?”


“Oh, certainly,” replied Carnap, “if the card were black, then God would exist.”


“Very good,” I said as I turned over the card. “As you see, the card is black. Therefore, God exists!”


“Ah, yes!” replied Carnap in a philosophical tone. “Proof by legerdemain! Same as the theologians use!”


Raymond Smullyan, 5000 BC and Other Philosophical Fantasies. New York: St. Martin’s Press, 1983, p. 24.

See Auerbach’s post for more Carnap and Smullyan anecdotes.

Why φ?

Yesterday, @gravbeast asked on Twitter,

Does anyone know why we traditionally use Greek phi and psi for metasyntactic variables representing arbitrary logic formulas? Is it just because ‘formula’ begins with an ‘f’ sound? And chi was being used for other things?

Although Whitehead and Russell already used φ and ψ for propositional functions, the convention of using them specifically as metavariables for formulas seems to go back to Quine’s 1940 Mathematical Logic. Quine used μ, ν as metavariables for arbitrary expressions, and reserved α, β, γ for variables, ξ, η, σ for terms, and φ, χ, ψ for statements. (ε, ι, λ had special roles.) Why φ for statements? Who knows. Perhaps simply because Whitehead and Russell used it for propositional functions in Principia? Or because “p” for “proposition” was entrenched, and in Classical Greek, φ was an aspirated p sound, not an f sound?

The most common alternative at the time was Fraktur letters, e.g., \(\mathfrak{A}\) as a metavariable for formulas and A as a formula variable, or x as a bound variable and \(\mathfrak{x}\) as a metavariable for bound variables. This was the convention in the Hilbert school, also followed by Carnap. Kleene later used script letters for metavariables and upright roman type for the corresponding symbols of the object language. But indicating the difference by different fonts alone is perhaps not ideal, and Fraktur may not have been the most appealing choice anyway, both because it was the 1940s and because the type was probably not available in American print shops.

Logic Colloquium, Udine

The European Summer Meeting of the Association for Symbolic Logic will be in Udine, just north of Venice, July 23-28. Abstracts for contributed talks are due on April 27. Student members of the ASL are eligible for travel grants!

lc18.uniud.it


The Significance of Philosophy to Mathematics

If you wanted to explain how philosophy has been important to mathematics, and why it can and should continue to be, it would be hard to do it better than Jeremy Avigad. In this beautiful plea for a mathematically relevant philosophy of mathematics, disguised as a book review, he writes:

Throughout the centuries, there has been considerable interaction between philosophy and mathematics, with no sharp line dividing the two. René Descartes encouraged a fundamental mathematization of the sciences and laid the philosophical groundwork to support it, thereby launching modern science and modern philosophy in one fell swoop. In his time, Leibniz was best known for metaphysical views that he derived from his unpublished work in logic. Seventeenth-century scientists were known as natural philosophers; Newton’s theory of gravitation, positing action at a distance, upended Boyle’s mechanical philosophy; and early modern philosophy, and philosophy ever since, has had to deal with the problem of how, and to what extent, mathematical models can explain physical phenomena. Statistics emerged as a response to skeptical concerns raised by the philosopher David Hume as to how we draw reliable conclusions from regularities that we observe. Laplace’s Essai philosophique sur les probabilités, a philosophical exploration of the nature of probability, served as an introduction to his monumental mathematical work, Théorie analytique des probabilités.


In these examples, the influence runs in both directions, with mathematical and scientific advances informing philosophical work, and the converse. Riemann’s revolutionary Habilitation lecture of 1854, Über die Hypothesen welche der Geometrie zu Grunde liegen (“On the hypotheses that lie at the foundations of geometry”), was influenced by his reading of the neo-Kantian philosopher Herbart. Gottlob Frege, the founder of analytic philosophy, was a professor of mathematics in Jena who wrote his doctoral dissertation on the representation of ideal elements in projective geometry. Late nineteenth-century mathematical developments, which came to a head in the early twentieth-century crisis of foundations, provoked strong reactions from all the leading figures in mathematics: Dedekind, Kronecker, Cantor, Hilbert, Poincaré, Hadamard, Borel, Lebesgue, Brouwer, Weyl, and von Neumann all weighed in on the sweeping changes that were taking place, drawing on fundamentally philosophical positions to support their views. Bertrand Russell and G. H. Hardy exchanged letters on logic, set theory, and the foundations of mathematics. F. P. Ramsey’s contributions to combinatorics, probability, and economics played a part in his philosophical theories of knowledge, rationality, and the foundations of mathematics. Alan Turing was an active participant in Wittgenstein’s 1939 lectures on the foundations of mathematics and brought his theory of computability to bear on problems in the philosophy of mind and the foundations of mathematics.

Go and read the whole thing, please. And feel free to suggest other examples!

The book reviewed is Proof and Other Dilemmas: Mathematics and Philosophy, Bonnie Gold and Roger A. Simons, eds., Mathematical Association of America, 2008.

[Photo: Bertrand Russell and G. H. Hardy as portrayed by Jeremy Northam and Jeremy Irons in The Man Who Knew Infinity, via MovieStillsDB]

Ptolemaic Astronomy

Working on the chapters on counterfactual conditionals for the Open Logic Project, I needed some illustrations for David Lewis’s sphere models, which he jokingly called “Ptolemaic astronomy.” Since Franz Berto joked that this should just require \usepackage{ptolemaicastronomy}, I wrote some LaTeX macros to make this easier using TikZ. You can download ptolemaicastronomy.sty (it should work independently of OLP); examples are in the OLP chapter on minimal change semantics (PDF, source).
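For illustration, here is a minimal hand-rolled TikZ sketch of a Lewis sphere model (nested spheres around a world \(w\)), drawn directly rather than with the package’s own macros, whose names and options may differ:

  % Nested spheres around the world of evaluation w, drawn with plain TikZ;
  % this does not use the macros provided by ptolemaicastronomy.sty.
  \documentclass{standalone}
  \usepackage{tikz}
  \begin{document}
  \begin{tikzpicture}
    % spheres of increasing dissimilarity from w
    \foreach \r in {0.5, 1, 1.5, 2}
      \draw (0,0) circle (\r);
    % the world of evaluation
    \fill (0,0) circle (1.5pt) node[below right] {$w$};
    % a world in the second sphere where some antecedent holds
    \fill (0.7,0.5) circle (1.5pt) node[above right] {$w'$};
  \end{tikzpicture}
  \end{document}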

(This will probably interest a total of two people other than me so I didn’t spend much time documenting it, but if you want to use it and need help just comment here.)

Update: it’s now in its own GitHub repository and properly documented.

A New University of Calgary LaTeX Thesis Class based on Memoir

The University of Calgary provides a LaTeX thesis class on its website. That class is based on the original thesis class, modified over the years to keep up with changes to the thesis guidelines of the Faculty of Graduate Studies (FGS). It produces atrocious results. Chapter headings are not aligned properly. Margins are set to 1 inch on all sides, which results in unreadably long lines of text. The template provided sets the typeface to Times New Roman. Urgh. A better class (by Mark Girard) is already available, but it, too, sets the margins to 1 inch. FGS no longer requires that the margins be exactly 1 inch, just that they be at least 1 inch. So we are no longer forced to produce that atrocious page layout.

I made a new thesis class. It’s based on memoir, which provides some nice functionality to compute an attractive page layout. By default, the class sets the thesis in 11 point type, one-and-a-half spaced, with about 65 characters per line. This produces a page approximating a nicely laid out book page. The manuscript class option sets it up for 12 point type, double spaced, with 72 characters per line and 25 lines per page. That’s still readable, but gives you extra space between the lines for annotations and editing marks, and wider margins. There are also class options to load some decent typefaces (palatino, utopia, garamond, libertine, and, ok, times).
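As a usage sketch only: the class file name below is a placeholder (check the actual distribution for the real name); the manuscript and typeface options are the ones described above.

  % Hypothetical invocation of the memoir-based thesis class; the file
  % name "ucalgarythesis" is a placeholder, not necessarily the real one.
  \documentclass[manuscript,palatino]{ucalgarythesis}
  \begin{document}
  \chapter{Introduction}
  In manuscript mode: 12 point type, double spaced, about 72 characters
  per line, and 25 lines per page.
  \end{document}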

Once upon a time, theses were typed on a typewriter and submitted to the examination committee in hardcopy. Typewriter fonts are “monospaced,” i.e., every character takes the same amount of space. “Elite” typewriters would print 12 characters per inch, or 72 characters per 6 inch line, and “Pica” typewriters 10 cpi, or 60 characters per line. Typewriters fit 6 lines into a vertical inch, or 25 lines per double-spaced page. A word is on average 5 characters long, hence we get about 250 words per manuscript page.

No one uses typewriters anymore to write theses, but thesis style guidelines are still a holdover from the time when we did. The guidelines still require that theses be one-and-a-half or double spaced. But of course they allow the use of word processing software, which doesn’t use monospaced typewriter fonts: the recommended typefaces, such as Times Roman, are much narrower and proportionally spaced. That means that even with 12 point type, a 6” line now contains about 89 characters on average, rather than 60. (Chris Pearson has estimated “character constants” for various typefaces, which you can use to estimate the average number of characters per inch at various type sizes. For Times New Roman, the factor is 2.48. At a line length of 6”, i.e., 432 pt, and 12 pt type, that gives 432 × (2.48/12) = 89.28 characters per line. With minimal margins of 1” you get 96 characters per line.)
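Spelled out as a formula, the estimate in the parenthesis is
\[
\text{characters per line} \approx \text{line length in points} \times \frac{c}{\text{type size in points}},
\]
where \(c\) is the character constant for the typeface; for Times New Roman (\(c = 2.48\)) at 12 pt on a 6” (432 pt) line, this gives \(432 \times 2.48/12 \approx 89\) characters.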

Applying typewriter rules to electronically typeset manuscripts results in lines that are very long, and long lines are hard to read. There should be somewhere between 50 and 75 characters per line; 66 characters is widely considered ideal. Readability is a virtue you want your thesis to have. And the thesis guidelines, thankfully, no longer set the margins, but only require minimum margins of 1” on all sides.


Modal Logic! Propositional Logic! Tableaux!

Lots of new stuff in the Open Logic repository! I’m teaching modal logic this term, and my ambitious goal is to have, by the end of term or soon thereafter, another nicely organized and typeset open textbook on modal logic. The working title is Boxes and Diamonds, and you can check out what’s there so far on the builds site.

This project of course required new material on modal logic. So far, this consists of revised and expanded notes by our dear late colleague Aldo Antonelli. These now live in content/normal-modal-logic and cover relational models for normal modal logics, frame correspondence, derivations, canonical models, and filtrations. So that’s one big exciting addition.

Since the OLP didn’t cover propositional logic separately, I have now added that part as well, so I can include it as review chapters. There’s a short chapter on truth-value semantics in propositional-logic/syntax-and-semantics. However, all the proof systems, and completeness for them, are covered as well. I didn’t write anything new for those, but rather made the respective sections for first-order logic flexible. OLP now has an FOL “tag”: if FOL is set to true and you compile the chapter on the sequent calculus, say, you get the full first-order version, with soundness proved relative to first-order structures. If FOL is set to false, the rules for the quantifiers and identity are omitted, and soundness is proved relative to propositional valuations. The same goes for the completeness theorem: with FOL set to false, it leaves out the Henkin construction and constructs a valuation from a complete consistent set, rather than a term model from a saturated complete consistent set. This works fine if you need only one or the other; if you want both, you’ll currently get a lot of repetition. I hope to add code so that you can first compile without FOL and then with it, and the second pass will refer to the text produced by the first pass rather than do everything from scratch. You can compare the two versions in the complete PDF. (A rough sketch of how such a toggle could be mimicked in plain LaTeX appears at the end of this post.)

Proof systems for modal logics are tricky; many systems don’t have nice natural deduction systems, say. The tableau method, however, works very nicely and uniformly. The OLP didn’t have a chapter on tableaux, so this motivated me to add that as well. Tableaux are also often covered in intro logic courses (often called “truth trees”), so including them as a proof system has the added advantage of tying in better with introductory logic material. I opted for prefixed tableaux (true and false are explicitly labelled, rather than implicit in negated and unnegated formulas), since that lends itself more easily to a comparison with the sequent calculus, but also because it extends directly to many-valued logics. The material on tableaux lives in first-order-logic/tableaux.

Thanks to Clea Rees for the prooftrees package, which made it much easier to typeset the tableaux, and to Alex Kocurek for his tips on doing modal diagrams in TikZ.
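By way of illustration only: a toggle of this kind can be mimicked with a plain LaTeX conditional. The macro names below are placeholders; OLP’s actual tagging mechanism is more elaborate.

  % A minimal stand-in for an "FOL" tag using \newif; not OLP's real mechanism.
  \documentclass{article}
  \newif\ifFOL
  \FOLtrue   % change to \FOLfalse for the propositional-only version
  \begin{document}
  \ifFOL
    Soundness is proved relative to first-order structures; the rules for
    the quantifiers and identity are included.
  \else
    Soundness is proved relative to propositional valuations; the quantifier
    and identity rules are omitted.
  \fi
  \end{document}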

Making an Accessible Open Logic Textbook (for Dyslexics)

In the design and layout of the Open Logic Project texts, as well as the Calgary Remix of the intro text forall x, we’ve already tried to follow the recommendations of the BC Open Textbook Accessibility Toolkit. Content is organized into sections, important concepts are highlighted (e.g., colored boxes around definitions and theorems), chapters have summaries, etc. We picked an easily readable typeface and set line and page lengths to enhance readability according to best (text)book design practices and research.

We’ve started experimenting specifically with a version of forall x that is better for dyslexic readers (see issue 22). Readability for dyslexics is affected by typeface, type size, and letter and line spacing. Charles Bigelow gives a good overview of the literature here.

Some typefaces are better for dyslexic readers than others. Generally, sans-serif fonts are preferable, but individual letter design is also relevant. The British Dyslexia Association has a page on it: the design should make it easy to distinguish letters, not just when they are close in shape (e.g., numeral 1, uppercase I, and lowercase l; numeral 0, uppercase O, lowercase o, and lowercase a) but also when they are upside-down or mirror images (e.g., p and q, b and d; M and W). In one study of reading times and reported preference, the sans-serif fonts Arial, Helvetica, and Verdana ranked better than other fonts such as Myriad, Courier, Times, and Garamond, and even the specially designed Open Dyslexic typeface. Although it would be possible to get LaTeX to output in any available typeface, it’s perhaps easiest to stick to those that come with the standard LaTeX distributions. The typeface that seems best to me from the readability perspective is Go Sans: it was designed by Bigelow & Holmes with readability in mind, and it distinguishes nicely between p and q; b and d; I, l, and 1; etc.

Other things that improve readability:
  • larger type size
  • shorter lines
  • increased line spacing
  • increased character spacing, i.e., “tracking” (although see Bigelow’s post for conflicting evidence)
  • no ALL CAPS or italics
  • no word hyphenation or right-justified margins
  • no centered text
The accessible version of forall x does all these things: Type size is set to 12 pt (not optimal on paper, but since this PDF will mainly be read on a screen, it looks large enough). Lines are shorter (about 40 instead of 65 characters per line). Line spacing is set at 1.4 line heights. Tracking is increased slightly, and ligatures (ff, fi, ffi) are disabled. Emphasis and defined terms are set in boldface instead of italics and small caps. Lines are set flush left/ragged right, and words are not hyphenated. The centered part headings are now also set flush left. (A rough sketch of how such settings could be approximated in a LaTeX preamble appears at the end of this post.)

The changes did break some of the page layout, especially in the quick reference, which still has to be fixed. There is also some content review to do. In “Mhy Bib I Fail Logic? Dyslexia in the Teaching of Logic,” Xóchitl Martínez Nava suggests avoiding symbols that are easily confused (i.e., don’t use both ∧ and ∨), avoiding formulas that mix letters and symbols that are easily confused (e.g., A and ∀, E and ∃), and avoiding easily confused letters in the same example (p, q). She also recommends introducing Polish notation in addition to infix notation, which would not be a bad idea anyway. Polish notation, I’m told, would also be much better for blind students who rely on screen readers or Braille displays. (The entire article is worth reading; h/t to Shen-yi Liao.)

Feedback and comments welcome, especially if you’re dyslexic! There’s a lot more to be done, of course, especially to make the PDFs accessible to the vision-impaired. LaTeX and PDF are very good at producing visually nice output, but not good at producing output that is suitable for screen readers, for instance. OLP issue 82 is there to remind me to get OLP output that validates as PDF/A compliant, which means in particular that the output PDF will have ASCII alternatives to all formulas, so that a screen reader can read them aloud. Even better would be a good way to convert the whole thing to HTML/MathML (forall x issue 23).
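For concreteness, here is a rough sketch of how settings along these lines could be approximated in a LaTeX preamble. It is only an illustration; the package choices are my own assumptions, and the actual accessible forall x style files differ in detail.

  % Approximating the accessibility settings described above; the actual
  % forall x accessible style files differ in detail.
  \documentclass[12pt]{article}
  \usepackage{setspace}
  \setstretch{1.4}                         % increased line spacing
  \usepackage[tracking=alltext]{microtype}
  \SetTracking{encoding=*}{20}             % slightly increased letter spacing
  \DisableLigatures{encoding=*, family=*}  % no ff, fi, ffi ligatures
  \usepackage[none]{hyphenat}              % no word hyphenation
  \usepackage[textwidth=4in]{geometry}     % shorter lines of text
  \AtBeginDocument{\raggedright}           % flush left / ragged right
  \begin{document}
  Emphasis and defined terms are set in \textbf{boldface} rather than italics
  or small caps.
  \end{document}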