I am currently reading Robert Causey's Logic, Sets and Recursion, which is intended to bridge the gap between freshman/sophomore "introductory discrete math" courses and senior-level mathematical logic. I am only about a third of the way through it, but I have already learned several useful things; I especially found the discussion of the mathematical definition of "relations" enlightening. My biggest complaint is that it uses different terminology from the other logic texts I have read ("sentential logic" rather than "propositional logic", for example), but at least it mentions the alternative terms when it introduces its own.
We use these lecture notes to teach the first semester of honors CS at Stony Brook. They're great (though some of the examples are a bit tedious---the class just finished Counting III and there are a bit too many card trick examples in that one).
(by "we" I mean the professor that teaches it, I'm just helping out this semester)
I am taking a discrete math course as a non-degree student and just found this text today. We are using Rosen's book in class, but I must say that this free text does a much better job of giving you the intuition behind what you are doing. This is an excellent resource for any discrete math student.
Second time this semester my textbook's been posted on HN. Go figure. Does this link make it up every couple of months? I thought there was something preventing duplicates.
(I am in the class, that is; I didn't write the textbook).
I've been using this to prepare for Discrete Math. It's pretty interesting; the authors do a good job of putting mathematical concepts in historical contexts, which keeps me engaged.
Discrete math and probability have been done to death as computer science-related topics. You cannot get far in a CS program without learning these topics forward and back. That being said, this book is good as a starting point that collects these topics together; I have not seen a similar book that does the same job.
What's not getting enough attention as "mathematics for computer science" is category theory. Put very roughly: it gives you some very powerful formal patterns that will crop up in every computer science-related thing you do, from theoretical computer science to the most 'boring' of dead-end programming jobs. Think of it as software engineering/design patterns without all the informal handwaving, and without the zealotry for/against <latest fad process or design pattern> that the field allows.
Category theory, if properly learned with an eye toward application, will make you utterly ashamed of the programs you write, and the languages you write them in. The end result is a program that still does not match the one in your head (which is built with a much more elegant set of abstractions), but it is very clear nonetheless.
Any suggestions for a good category theory text that's oriented to CS and goes beyond the "classical" connections between cartesian closed categories and lambda calculi? Or am I stuck in another frustrating two-week-long session trying to divine "Category Theory for the Working Mathematician" and a handful of research papers and blogs on Haskell?
Edit: Sorry if I sound bitter, but I've been down the category theory route multiple times in the past, starting in graduate school (in Math) and I've always given up before I "got" the divine revelation that others have talked about.
I don't know any way round the fact that category theory is difficult to learn. I think that's partly inherent in the subject, not helped by the lack of a really good textbook suitable for beginners.
Yeah, I came across Barr and Wells 5 years ago in grad school and found it almost impossible to buy. I ended up getting a copy from the library and photocopying and rebinding it. Like you said, it was ok, but hardly comprehensive. Maybe it's easier to find now.
Anyway, nice paper... and thanks for making me think of category theory again. Maybe I'll pull out my old copy of "CT for the Working Mathematician" and give it (and Haskell) another go.
I think you can buy the Barr and Wells book via the web page I gave. In fact I think that's how I got my copy, which must have been around the same time you were looking for it.
I haven't read Walters' book, and would also be interested to hear reactions from anyone here who has.
Since they haven't been mentioned yet, one of the following might come in handy:
- Basic Category Theory for Computer Scientists by Benjamin C. Pierce (of "Types and Programming Languages" fame)
- Practical Foundations of Mathematics, by Paul Taylor (http://www.paultaylor.eu/~pt/prafm/; he has other interesting work on his homepage as well; somebody on HN pointed me to it, thanks! See Chapters 4, 5, and 7.)
Do you know of an example that shows category theory being applied to solving a CS problem?
From what little I've seen of it, category theory seems to be even more abstract than conventional abstract algebra, and I've rarely heard of even group theory being used in the development of an algorithm.
I think it depends how broadly or narrowly you conceptualise computer science. I doubt category theory is much use for developing algorithms, but it's been an important influence on type systems for programming languages, especially modern functional languages but also more mainstream things like generic types in Java.
Taking your question literally (“category theory being applied to solving a CS problem”), I'd say the full abstraction problem for PCF is a good example of a problem in computer science whose solution leant heavily on category theory.
I thought group theory was the easy part. Indeed, that's pretty much what crypto is based on, and many other security applications in CS also rely on group-theoretic notions. Consider homomorphic encryption: the idea that one can develop an encryption algorithm that allows an untrusted source to work with your sensitive data without discovering the data itself, because there exist 'one-way' functions from plaintext to gibberish that actually preserve certain operations you care about; i.e., if
encrypt(x) = x'
encrypt(y) = y'
encrypt(z) = z'
and x + y = z, homomorphic encryption for (+) gives you that x' + y' = z'.
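To make that concrete, here's a toy sketch in Haskell of the additive-homomorphism idea. The scheme, key, and modulus are all invented for illustration, and it is of course not remotely secure:

    -- Toy sketch only: "encryption" is multiplication by a secret
    -- key modulo n, which happens to preserve addition.
    secretKey, modulus :: Integer
    secretKey = 7      -- hypothetical secret, invertible mod `modulus`
    modulus   = 101

    encrypt :: Integer -> Integer
    encrypt x = (secretKey * x) `mod` modulus

    decrypt :: Integer -> Integer
    decrypt c = (29 * c) `mod` modulus   -- 29 = 7^(-1) mod 101

    -- Since k*x + k*y = k*(x+y) (mod n), adding ciphertexts mirrors
    -- adding plaintexts, without the adder ever seeing x or y.
    addCipher :: Integer -> Integer -> Integer
    addCipher c1 c2 = (c1 + c2) `mod` modulus

Here decrypt (addCipher (encrypt 3) (encrypt 4)) comes out to 7, even though whoever ran addCipher only ever saw the ciphertexts 21 and 28.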
Farther away from completely direct applications like these are algebraic ways of viewing a CS problem. Consider vectorizing a for loop that updates a variable in place with +=. We know that this can be turned into a parallel reduction: if N is the length of the list the loop iterates over, the parallel reduction takes O(log N) time. But which other operations does this work for? Associative ones? What can we do if the operation is not associative, or if the return type of the in-place update is different? And so on.
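To sketch why associativity is the crux, here's a hypothetical tree-shaped reduction in Haskell (the names are mine). It agrees with the sequential loop exactly when the operation is associative with identity z, i.e., when the values form a monoid:

    -- Divide-and-conquer reduction. Written sequentially here, but
    -- the two recursive calls are independent, so they could run in
    -- parallel, giving the O(log N) depth mentioned above.
    reduceTree :: (a -> a -> a) -> a -> [a] -> a
    reduceTree _  z []  = z
    reduceTree _  _ [x] = x
    reduceTree op z xs  =
      let (l, r) = splitAt (length xs `div` 2) xs
      in  reduceTree op z l `op` reduceTree op z r

    -- For an associative op with identity z, this computes the same
    -- answer as the sequential foldl op z xs; drop associativity and
    -- the rebracketing is no longer sound.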
That's pretty much the tip of the iceberg when it comes to abstract algebra being used in CS.
Now on to category theory. I'm still very, very new to it, but I've been seeing tons of connections anyway. I figure I can view its higher level of abstraction in two different ways: either it's going to be hard and impractical to make any connection to anything concrete, or, excitingly, many things admit a categorical view precisely because of how general the theory is.
Let's start with the category-theoretic notion of functors: roughly, structure-preserving mappings that take functions to functions. Never going to see these outside of functional programming, right? In fact, you've worked with functors if you've ever spent any significant time in a Unix shell.
Indeed, 'sudo' can be seen as a functor, taking commands (handwavingly, morphisms from the set of user-privilege states to itself) to higher-privileged commands. Let's check the idea against the functor laws. Does sudo <cmd1>; sudo <cmd2> behave the same as sudo cmd3 (where cmd3 is a script that does <cmd1>;<cmd2>)? What is sudo sudo cmd? Now think of any Unix command modifier you would care to write. How would its satisfying the functor laws make it an easier command modifier to use?
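Here's a toy Haskell model of that question; the state representation and the elevate function are made up for illustration, not how sudo actually works:

    data Priv = User | Root deriving (Eq, Show)
    type Sys  = (Priv, [String])    -- (current privilege, a toy log)
    type Cmd  = Sys -> Sys

    -- Run a command with Root privilege, then restore the caller's.
    elevate :: Cmd -> Cmd
    elevate cmd (p, lg) =
      let (_, lg') = cmd (Root, lg)
      in  (p, lg')

    -- Functor-law analogues to check:
    --   identity:    elevate id == id           (holds in this model)
    --   composition: elevate (c2 . c1) == elevate c2 . elevate c1
    -- The second holds only when c1 leaves the privilege at Root,
    -- which is exactly the "sudo <cmd1>; sudo <cmd2>" versus
    -- "sudo cmd3" question above.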
Also important is the notion of initial objects, in the context of proving properties about the behavior of any algorithm on any data structure (including an interpreter on a programming language!). It generalizes the notion of a 'starting object' in a collection of objects related to each other by transformations, like the empty list in the set of lists related by concatenation. What's useful about this? Properties of structures built up from an initial object admit easy proofs by induction. Moreover, if you can come up with the proper set of objects and transformations, you can come up with very 'useful' initial objects for whatever you're trying to prove.
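As a Haskell sketch of the lists example: the list type is (glossing over details) the initial algebra of its signature, and the fold is the unique map out of it; that uniqueness is exactly where the induction principle comes from. The names are mine, not from any particular library:

    data List a = Nil | Cons a (List a)

    -- The unique structure-preserving map out of the initial algebra:
    -- supply a value for Nil and a combiner for Cons, and recursion
    -- (hence proof by induction over Nil/Cons) comes for free.
    cata :: b -> (a -> b -> b) -> List a -> b
    cata nil _    Nil         = nil
    cata nil cons (Cons x xs) = cons x (cata nil cons xs)

    -- Concatenation (the transformation in the example above) and
    -- length both fall out of cata, and facts like
    -- len (append xs ys) == len xs + len ys go by induction on xs.
    append :: List a -> List a -> List a
    append xs ys = cata ys Cons xs

    len :: List a -> Int
    len = cata 0 (\_ n -> n + 1)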