Because once you understand type theory, you unlock the ability to win arguments on the internet using words like "monoid" and "dependent elimination". Also, it's comforting to know that while your life may lack structure, your types never will.
I know this is supposed to be a joke but I don’t get it? Type theory isn’t particularly heavy in terms of jargon compared to other foundational theories like category theory or set theory. These are highly technical fields of study, so of course they’re going to need specific language. It would be like complaining about a doctor using “fancy terms” when discussing complex medical procedures.
To give a non-cynical answer: a good reason to look at type theory - if you’ve been exposed to computer science - is that it treats syntax and syntactical manipulation as foundational, and can be viewed as the branch of mathematics that ties syntax to the notion of computation.
And ‘monoid’ isn't even from type theory!
Not heavy on jargon? Compared to what? It’s all jargon
> compared to other foundational theories like category theory or set theory
> Type theory isn’t particularly heavy in terms of jargon compared to other foundational theories like category theory or set theory.
And anyone who thinks their own area doesn't have jargon has just forgotten how to recognize it.
And quartz, of course.
https://xkcd.com/2501/
This is the type of snark that's HN appropriate.
I think the answer is yes. Modern efforts to formalize mathematics use type theory a lot, specifically calculus of inductive constructions, rather than ZFC.
Around 2000, I studied applied mathematics. I was curious about automated theorem proving, but I didn't understand the language (being trained, like most mathematicians, to prove things in classical logic rather than type theory). Years later I discovered Haskell, which became a "gateway drug" to type theory. I wish something like Lean had existed 25 years ago, when I was a student.
I think fully formal methods in mathematics are the future, and every math student should learn a bit of Lean (or a similar tool, but IMHO Lean has the best tutorials, aimed at people with a traditional education). Hence, everybody should learn the basics of type theory.
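To give a flavor of what that looks like, here is a minimal Lean 4 sketch (the theorem name zero_add' is invented to avoid clashing with the standard library, which already proves this): a proof is just a term checked against the stated type, and induction is an ordinary proof move.

    -- A machine-checked proof that 0 + n = n, by induction on n.
    -- (Lean defines addition by recursion on the second argument, so this
    -- direction genuinely needs induction, unlike n + 0 = n.)
    theorem zero_add' (n : Nat) : 0 + n = n := by
      induction n with
      | zero => rfl
      | succ k ih => rw [Nat.add_succ, ih]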
As a mereological nihilist I like type theory, but I wish it were type-and-schema theory.
Treating complex types, structure, arrangement, entanglement, etc. as if they have the same realness or priority as primitive types is a mistake.
It's like treating recursion as physically real (it's not) when it smuggles in the call stack.
Topos theory can be added to type theory to provide a categorical semantics.
But even a Grothendieck topology (total and co-total categories) requires sets with a least upper bound (join) and a greatest lower bound (meet).
The problem is that you get semi-algorithms, which will only halt if the answer is true.
IMHO it is better to think of recursion as enabling realizability, and forcing fixed points. The Y combinator can be used for recursion because it is a fixed-point combinator.
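For a concrete version of "recursion via fixed points", a minimal Haskell sketch (Haskell's laziness lets us define the fixed-point combinator directly; the classic untyped Y combinator only typechecks there with a wrapper type):

    -- fix f is the fixed point of f: f (fix f) = fix f.
    -- All recursion falls out of this one self-referential definition.
    fix :: (a -> a) -> a
    fix f = f (fix f)

    -- Factorial as the fixed point of a non-recursive functional.
    factorial :: Integer -> Integer
    factorial = fix (\rec n -> if n <= 1 then 1 else n * rec (n - 1))

Here factorial 5 evaluates to 120, with no explicit recursion in the functional itself.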
From descriptive complexity, FO(LFP) = SO-Horn = P (on ordered structures), while FO by itself is AC^0, IIRC?
Using modal logic + relations is another path.
The problem I have found is that those paths tend to require someone to at least entertain intuitionistic concepts, and that is a huge barrier.
Mostly, people tend to use type theory to convert semantic properties to trivial semantic properties for convenience.
Blackburn/de Rijke/Venema's “Modal Logic” is the most CS-friendly book for that if you want to try.
Yes, it's interesting to see when coders start dealing with physical world problems and sensor systems and don't realize they are relying on constructivity to do anything at all. I've mostly only met other devs who have also worked in the CRDT space that grok intuitionistic reasoning.
Yes well we all have our pet theories about the world. Thankfully the people that think the natural numbers aren't "real" or that recursion isn't "real" haven't won out and destroyed our ability to make meaningful progress.
I'm curious about what you mean by recursion not being physically real? Do you mean it doesn't convert to CPU instructions, or something about its occurrence in nature?
There's no call stack in a tail-recursive function. It's equivalent to a loop.
And that, if you think about it a bit, might help you understand what recursion is, because your current understanding seems a bit off.
The only time you need a "call stack" with recursive calls is if the calls need to ultimately return to the call site and continue executing. It's still recursive.
By contrast, the model in which every function call pushes a return address, whether or not it's really needed, is a kind of mistake that comes from not understanding recursion.
Yes, this is correct: tail-call-optimized recursion is physically different from other kinds of recursion. It's much-needed syntactic and structural sugar for goto.
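A minimal Haskell sketch of the distinction being argued above (the function names are invented for the example): the first version has work left to do after each recursive call returns, so frames must accumulate; the second is a loop in disguise.

    {-# LANGUAGE BangPatterns #-}

    -- Not tail-recursive: the addition happens *after* the recursive call
    -- returns, so each call needs a frame to come back to.
    sumTo :: Int -> Int
    sumTo 0 = 0
    sumTo n = n + sumTo (n - 1)

    -- Tail-recursive: the recursive call *is* the whole result. With the
    -- strict accumulator this runs in constant stack space, like a loop.
    sumTo' :: Int -> Int
    sumTo' = go 0
      where
        go !acc 0 = acc
        go !acc n = go (acc + n) (n - 1)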
OT, but is this kind of well-researched user content going to disappear as AI content fills the web? Will some places/forums keep power users interested in spending time researching and answering, when what they produce will be ingested into future models and not sourced back to them?
I'm a very small-time MSE user, but indeed such concerns made me reluctant to answer recently, or even to participate in general. The CC BY-SA license requires both Attribution and ShareAlike, neither of which happens. I expect people will move to private silos of information as much as possible.
Yeah, but how do you make sure no robots subscribe to the private silos?
Also, there's always some kind of minimal size required for a space to be interesting enough for people to participate. And network effects.
Legal means, mostly. And yeah, network effects are indeed the biggest obstacle.
I'm just an old country imperative programmer, not a big-city functional programmer, so excuse my ignorance, but I had an epiphany the other day. Functional programmers with strict types organize their data with complex types, while I do the same with classes. I need to do more research to verify this, but it explains to me the focus on types in functional languages.
Likewise here. After BASIC, I learned Pascal, in the early 80s. We were taught the principle that if you designed your data types right, the rest of the program would practically write itself.
What I can't tell you is if data types in computer programs are related to type theory.
Yeah I think there’s some confusion here.
When I hear “type theory” - I think Martin-Löf, Girard’s Proofs and Types, Curry-Howard, etc.
Which is not a “theory of programming language types” - although the subjects are not completely isolated from each other.
Then again, it’s been decades since I actually studied this stuff and haven’t really followed either subject since.
As far as the latter is concerned (programming language type systems), as an engineer, I’ve seen nothing in mainstream programming languages that advances the state of the art compared to what was known back in the 1980s - when Hindley-Milner became mainstream and Luca Cardelli wrote the seminal survey papers on the subject.
Typically when an OOP programmer says the word ‘class’ they mean one (or both) of two things, depending on their favourite OOP language:
- a factory for objects, that hides some internal state of the object;
- a static construct that gives a name to some (‘public’) subset of the behaviour of those objects, and which a type-checker can then use to ensure that the use of the objects adheres to the intended contract.
They're both mechanisms for encapsulation of internal behaviour (that may change at any time) while providing a blessed public API that you promise will stay the same (or at least only change in certain ways, under a strict discipline like semver or ‘we'll tell you on the mailing list’). The difference is that the former is runtime (dynamic) behaviour, while the latter is compile-time (static) behaviour. The former maps to functions (or, with state, closures), not types, in functional languages (see ‘objects are a poor man's closures, and vice versa’ [1]), and the latter… is types :) The same typing concept that ensures correct interaction with objects can also be used to ensure correct interaction with functions or other data types, though, since functional languages are easier to type than OO languages, strictly typed functional languages tend to have stronger type systems that capture more properties, like effects.
[1]: https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/...
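That ‘closures as objects’ correspondence is easy to demo. A minimal Haskell sketch (makeCounter is a name invented for the example): the IORef plays the private field, and the returned action is the sole public method.

    import Data.IORef

    -- A counter 'object' as a closure: the IORef is hidden state, and the
    -- returned IO action is the only public way to touch it.
    makeCounter :: IO (IO Int)
    makeCounter = do
      ref <- newIORef 0
      pure $ do
        modifyIORef' ref (+ 1)
        readIORef ref

Each call to makeCounter produces an independent ‘instance’, exactly as two objects of the same class would be.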
From one perspective, there's a spectrum imperative <-> procedural <-> OOP <-> functional that is about how tightly scoped state changes are. In that view, functional languages are just the natural next step: if the whole point of OOP is to scope state, a functional language is just an OOP language with _even more_ tightly scoped state. The philosophy (uncontrolled state changes are scary and hard to reason about: add language constructs that allow the reader to see at a glance how far they have to look for the knock-on effects of a given state change) is the same; just the implementation is different.
Thanks for the thoughtful response. As you know, I was drawing a parallel between the compile-time contract that a class provides and the role of complex types in functional programming. In many cases, a module that defines data types alongside functions operating on them feels similar to a class that encapsulates both structure and behavior. I realize there are deeper distinctions and advantages on the functional side, but I’m just trying to simplify and compare the basic conceptual building blocks. I also really liked your spectrum from imperative to functional. It's a helpful framing.
No you're totally right! In fact I think you're even more right than you said in the original post: the analogy you draw isn't just an analogy, they are in fact the exact same thing. A class is (amongst other things) a module, and a closure, and a type. In functional programming these things are (usually!) deconstructed rather than all being facets of the ‘class’ concept, but they're all still there and they all work exactly the same way as in OOP (perhaps modulo mutability, depending on choice of languages — but you can see local mutability as syntax sugar over function calls anyway).
One of the core concepts in type theory is the function type. It's a kind of building block. So perhaps better would be to say functional programmers organize everything using functions.
I suggest you look at the simply typed lambda calculus; it gives a basic rationalization of types in functional programming.
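To make that concrete, here is a toy type checker for the simply typed lambda calculus in Haskell (a sketch with one base type; all names here are illustrative, not any library's API):

    -- Types: one base type, plus function (arrow) types.
    data Ty = TBase | TArr Ty Ty deriving (Eq, Show)

    -- Terms: variables, typed lambda abstraction, application.
    data Tm = Var String | Lam String Ty Tm | App Tm Tm deriving Show

    type Ctx = [(String, Ty)]

    -- The three typing rules of STLC: variable lookup, abstraction,
    -- and application (which checks that argument and domain types agree).
    typeOf :: Ctx -> Tm -> Maybe Ty
    typeOf ctx (Var x)        = lookup x ctx
    typeOf ctx (Lam x t body) = TArr t <$> typeOf ((x, t) : ctx) body
    typeOf ctx (App f a)      = do
      TArr dom cod <- typeOf ctx f
      arg          <- typeOf ctx a
      if arg == dom then Just cod else Nothing

For instance, typeOf [] (Lam "x" TBase (Var "x")) gives Just (TArr TBase TBase), while ill-typed terms like self-application come back Nothing.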
In a statically-typed language like Java, every class is a type - though not vice-versa, since things like interfaces and records (in Java) are also types.
Non-OOP statically typed languages are basically just saying "ok that's fine, but there's no need to attach state or behaviour to types".
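In Haskell terms (Shape and area are invented for illustration), that looks like a plain datatype plus free-standing functions:

    -- The type describes shape only; behaviour lives in ordinary functions
    -- that anyone can add later without touching the type.
    data Shape = Circle Double | Rect Double Double

    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h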
The OP goes on to immediately ask whether category theory or type theory can serve as a foundation for mathematics (they give a link to an article that talks of "foundational styles" but which doesn't give a process for explicitly "cutting the cord" from set theory).
I think the answer by (perhaps infamous) Paul Taylor in this mathoverflow post[1] is a clearer expression of the positions of those who want to jettison traditional set theory as the foundation of mathematics.
"The crucial issue is whether you are taken in by the Great Set-Theoretic Swindle that mathematics depends on collections (completed infinities)"
Yes, it's important to remember that the things labeled "infinity" in math don't "exist-exist" but rather are some combination of unending process and arbitrary labeling (see Skolem's paradox etc). But that doesn't mean such nominally completed infinities aren't useful, or even necessary, for a human being to grasp mathematics.
Anyway, for this debate it's worth reading all of Taylor's comment (which isn't to say I share his position, but it's good to understand where the "finitists" are coming from).
[1] https://mathoverflow.net/questions/8731/categorical-foundati...
I don’t quite buy this, because we want to abstract over the stuff reachable by a particular generic process, say, over the natural numbers. We want to say, in mathematical language (i.e. as formulas), things like “whatever x we can reach in this kind of process, property P will hold for it”. This requires abstracting over all possible x, and in that abstraction, “all possible x” becomes itself an object. And we call those kinds of objects sets.
Where does one even start learning type theory or category theory? I’ve had a passing interest from listening in on conversation between Rust friends, but never found a place to do a like type theory 101 for software engineers
Books. Some books on my shelf covering these topics:
"Type Theory and Functional Programming" by Simon Thompson (PDF available via quick search)
First 120 pages of "Type Theory and Formal Proof" by Rob Nederpelt / Herman Geuvers
For some early stuff on dependent types (also covered in the above), check out "Intuitionistic Type Theory" by Per Martin-Löf
"Computation and Reasoning: A Type Theory for Computer Science" by Zhaouhui Luo
"Type-Driven Development with Idris" by Edwin Brady
"Gentle Introduction to Dependent Types with Idris" by Boro Sitnikovski
"Types and Programming Languages" by Pierce
"Practical Foundations for Programming Languages" by Robert Harper (or any of his YouTube videos re/Type Theory)
"The Little Typer" by Friedman and Christiansen
"Modal Homotopy Type Theory: The Prospect of a New Logic for Philosophy" by David Corfield (just took this one to OBX North Carolina to enjoy on my beach vacation last week! Got some salt water in those pages. Also speaks to Category Theory.)
For Category Theory:
"The Joy of Abstraction" (2023) by Euginia Cheng
"Basic Category Theory for Computer Scientists" by Benjamin C. Pierce
"Conceptual Mathematics: A first introduction to categories" by Lawvere and Schanuel
"Categories for the Working Philosopher" by Landry
Bartosz Milewski’s Category Theory for Programmers lectures [1] are good. Especially paired with his book [2].
I actually find that category theory is remarkably simple. At least at the level where it applies to day-to-day programming concepts. The one major benefit I got from it, was that it helped me reason about high level/abstract structure better (which helps my architectural designs/thinking).
[1] https://youtube.com/playlist?list=PLbgaMIhjbmEnaH_LTkxLI7FMa...
[2] https://github.com/hmemcpy/milewski-ctfp-pdf
I enjoyed Types and Programming Languages by Benjamin Pierce. Also vaguely related is The Implementation of Functional Programming Languages by Simon Peyton Jones (this is more about lambda calculus, but I found that when you pair type theory with lambda calculus you get some nice synergies as a programmer).
I watched these lectures (https://www.youtube.com/@hottlectures5237), at least up to the point where they seemed to be diverging from things I could see applications for. There's an associated book, which is fairly tough going but enlightening (easier after watching some of the videos).
Let's say I ask, "Why is it worth spending time in Baltimore?"
I could get answers about the national anthem, about Camden Yards, about good restaurants and bars. Those are "tourist" answers.
Or I could get answers about job availability and cost of housing. Those are "move there" answers.
It seems to me that this article (if you can call it that) gives mostly "tourist" answers, not "move there" answers. Well, that's all right in a way - for an academic topic, you probably have to get tourists before you get people moving there. But the problem is that mathematicians have a large number of areas that they could visit as tourists, and nothing here tells them why type theory is better to visit than, say, algebraic topology.
And that doesn't solve the original complaint. Why are so few people working in type theory? Because more of the action is in set theory and category theory, and type theory offers too little that the others don't, so nobody's going to switch.
I mean, most mathematicians aren't really in foundations anyway. They're in differential equations or prime numbers or complex analysis or algebraic topology or something like that, and making your foundation type theory instead of set theory makes no difference whatsoever to those people. Only the people working on foundations care, and as I said, most of those people already work in set theory or category theory, and type theory doesn't give them a good enough reason to uproot and move.
I’m not sure Set theory and Type theory compete for the same foundations people.
Type theory’s stomping ground is the foundations of computation. If you sampled a theoretical comp. sci. audience, I’m pretty sure type theory would “win” the popularity contest.
Somewhat similar to intuitionism and constructivism - interesting to people interested in computability but considered almost silly or at least niche by “mainstream” math theoreticians.
> Somewhat similar to intuitionism and constructivism - interesting to people interested in computability but considered almost silly or at least niche by “mainstream” math theoreticians.
This was a source of minor culture clash for me: I spent some time over the past couple of years working through some Coq-based educational materials (mainly Software Foundations led by Pierce) and talking to one of the grad student coauthors about his (PL theory/type theory) research. So I rather strongly internalized from this material that "we" don't automatically assume the Law of the Excluded Middle, but at least warn people if we had to use it (as it makes proofs nonconstructive: you no longer have a method for calculating the objects you are talking about). After this, I commented on something on Scott Aaronson's blog and I happened to say something like "wait, but doesn't that require excluded middle, and you didn't explicitly say so?" and Aaronson was rather confused, basically saying something like "we normal mathematicians assume excluded middle all the time, and we don't say so".
So yeah, there is a strong trend in favor of those things in some parts of type theory, but other mathematicians are then surprised that anyone would care. (And Aaronson in particular is someone very interested in and expert about computability, though one might say from a more traditional core computer science perspective.)
I'm going to reply to my own comment to make a digression.
It's an intuitionistic logic theorem that the excluded middle is "irrefutable", that is, that you can never exhibit a counterexample. You can prove this constructively! This leads to something called the double negation translation, where all theorems that could be proved using excluded middle can be proven constructively to be "irrefutable" (rather than "true"), that is, that it would be impossible for an exception or counterexample to the theorem to exist. So you will get ~~P where classical logic could establish P (and deducing P from ~~P is equivalent to excluded middle).
One argument for not assuming excluded middle in proofs is then that the deductive system automatically shows you exactly which conclusions are constructive and which are not constructive, and you can interpret your deductive system in such a way that you're getting the same answers, just with this additional information or metadata about whether the answer was obtained constructively or nonconstructively. If you prove P, you did it constructively (you can exhibit the objects that P talks about, or extract an algorithm for locating or identifying them); if you prove ~~P, you did it nonconstructively, if "it" is considered as "proving P itself" (your argument shows that it would be impossible for the objects in question definitively not to exist, but you don't necessarily know how to find or calculate them).
So one can say, again, that in working this way you're learning the same stuff about your underlying subject matter, but with additional information attached at the end!
This is described in a more formal way at
https://en.wikipedia.org/wiki/Double-negation_translation
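The irrefutability claim itself is a two-line constructive proof. A sketch in Lean 4 (theorem name invented for the example):

    -- Excluded middle is irrefutable: ¬¬(P ∨ ¬P), with no classical axioms.
    -- Given a hypothetical refutation h of (P ∨ ¬P), we hand it the right
    -- injection, whose ¬P component itself uses h on the left injection.
    theorem em_irrefutable (P : Prop) : ¬¬(P ∨ ¬P) :=
      fun h => h (Or.inr (fun p => h (Or.inl p)))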
Fair enough, though TFA seemed to be focused on the "foundations of mathematics" angle. If you want to say it's good for foundations of CS... I can probably agree (though the category theory folks may want a word).
It is not worth spending time on type theory. It is quite meaningless, literally.