This short introduction to category theory is for readers with relatively little mathematical background. At its heart is the concept of a universal property, important throughout mathematics. After a chapter introducing the basic definitions, separate chapters present three ways of expressing universal properties: via adjoint functors, representable functors, and limits. A final chapter ties the three together.
For each new categorical concept, a generous supply of examples is provided, taken from different parts of mathematics. At points where the leap in abstraction is particularly great (such as the Yoneda lemma), the reader will find careful and extensive explanations.
Tom Leinster has released a free electronic copy of his textbook Basic Category Theory on the arXiv.
My friend Tom Leinster has written a great introduction to that wonderful branch of math called category theory! It’s free:
It starts with the basics and leads up to a trio of related concepts, all of which are ways of talking about universal properties.
Huh? What’s a ‘universal property’?
In category theory, we try to describe things by saying what they do, not what they’re made of. The reason is that you can often make things out of different ingredients that still do the same thing! And then, even though they will not be strictly the same, they will be isomorphic: the same in what they do.
A universal property amounts to a precise description of what an object does.
Universal properties show up in three closely connected ways in category theory, and Tom’s book explains these in detail:
through representable functors (which are how you actually hand someone a universal property),
through limits (which are ways of building a new object out of a bunch of old ones),
through adjoint functors (which give ways to ‘freely’ build an object in one category starting from an object in another).
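To make this slightly more concrete, here is the standard first example of a universal property — a sketch of the usual textbook definition, not a quotation from Tom's book: the product of two objects $A$ and $B$.

```latex
An object $A \times B$, equipped with projection maps
$p_A \colon A \times B \to A$ and $p_B \colon A \times B \to B$,
is a \emph{product} of $A$ and $B$ if for every object $X$ and
every pair of maps $f \colon X \to A$ and $g \colon X \to B$
there is a \emph{unique} map
$\langle f, g \rangle \colon X \to A \times B$ such that
\[
  p_A \circ \langle f, g \rangle = f
  \qquad \text{and} \qquad
  p_B \circ \langle f, g \rangle = g .
\]
```

Notice that this says nothing about what $A \times B$ is made of — only what it does. That is exactly why any two objects satisfying the property are isomorphic.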
If you want to see this vague wordy mush here transformed into precise, crystalline beauty, read Tom’s book! It’s not easy to learn this stuff – but it’s good for your brain. It literally rewires your neurons.
Here’s what he wrote, over on the category theory mailing list:
My introductory textbook “Basic Category Theory” was published by Cambridge University Press in 2014. By arrangement with them, it’s now also free online:
It’s also freely editable, under a Creative Commons licence. For instance, if you want to teach a class from it but some of the examples aren’t suitable, you can delete them or add your own. Or if you don’t like the notation (and when have two category theorists ever agreed on that?), you can easily change the LaTeX macros. Just go to the arXiv, download, and edit to your heart’s content.
There are lots of good introductions to category theory out there. The particular features of this one are:
• It’s short.
• It doesn’t assume much.
• It sticks to the basics.
Logical probability theory was developed as a quantitative measure based on Boole’s logic of subsets, but information theory was developed into a mature theory by Claude Shannon with no such connection to logic. A recent development in logic changes this situation. In category theory, the notion of a subset is dual to the notion of a quotient set or partition, and recently the logic of partitions has been developed in a parallel relationship to the Boolean logic of subsets (subset logic is usually mis-specified as the special case of propositional logic). What, then, is the quantitative measure based on partition logic in the same sense that logical probability theory is based on subset logic? It is a measure of information named “logical entropy” in view of that logical basis. This paper develops the notion of logical entropy and the basic notions of the resulting logical information theory, and then makes an extensive comparison with the corresponding notions based on Shannon entropy.
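To ground the comparison, here is a minimal sketch — my own illustration, not code from the paper — of the two measures on a finite probability distribution: logical entropy $h(p) = 1 - \sum_i p_i^2$ (the probability that two independent draws land in different blocks) versus Shannon entropy $H(p) = -\sum_i p_i \log_2 p_i$.

```python
import math

def logical_entropy(probs):
    # h(p) = 1 - sum_i p_i^2: the chance that two independent
    # draws from the distribution fall in *different* blocks.
    return 1.0 - sum(p * p for p in probs)

def shannon_entropy(probs):
    # H(p) = -sum_i p_i log2(p_i), in bits; 0 * log 0 is taken as 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A partition with block probabilities 1/2, 1/4, 1/4:
probs = [0.5, 0.25, 0.25]
print(logical_entropy(probs))   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(shannon_entropy(probs))   # 0.5*1 + 0.25*2 + 0.25*2 = 1.5
```

Both vanish on a one-block (deterministic) partition and both grow as the distribution spreads out, but they weight the spreading differently — which is where the extensive comparison in the paper comes in.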
Ellerman is visiting UC Riverside at the moment. Given the overlap between information theory and category theory, I’m curious whether he’s working with John Carlos Baez, or whether Baez is aware of this work.
Based on a cursory look at his website(s), I’m going to have to start following more of this work.