
In this installment, I will introduce the concept of Boolean algebra, one of the main stars of this series, and relate it to concepts introduced in previous lectures (distributive lattice, Heyting algebra, and so on). Boolean algebra is the algebra of classical propositional calculus, and so has an abstract logical provenance; but one of our eventual goals is to show how any Boolean algebra can also be represented in concrete set-theoretic (or topological) terms, as part of a powerful categorical duality due to Stone.

There are *lots* of ways to define Boolean algebras. Some definitions were for a long time difficult conjectures (like the Robbins conjecture, established only in the last ten years or so with the help of computers) — testament to the richness of the concept. Here we’ll discuss just a few definitions. The first is a traditional one, and one which is pretty snappy:

A **Boolean algebra** is a distributive lattice in which every element has a complement.

(If $X$ is a lattice and $x \in X$, a *complement* of $x$ is an element $y$ such that $x \wedge y = 0$ and $x \vee y = 1$. A lattice is said to be *complemented* if every element has a complement. Observe that the notions of complement and complemented lattice are manifestly self-dual. Since the notion of distributive lattice is self-dual, so therefore is the notion of Boolean algebra.)

**Example**: Probably almost everyone reading this knows the archetypal example of a Boolean algebra: a power set $PX$, ordered by subset inclusion. As we know, this is a distributive lattice, and the complement $\neg A := X \setminus A$ of a subset $A \subseteq X$ satisfies $A \wedge \neg A = 0$ and $A \vee \neg A = 1$.

**Example**: Also well known is that the Boolean algebra axioms mirror the usual interactions between conjunction $\wedge$, disjunction $\vee$, and negation $\neg$ in ordinary classical logic. In particular, given a theory $T$, there is a preorder whose elements are sentences (closed formulas) $p$ of $T$, ordered by $p \leq q$ if the entailment $p \vdash q$ is provable in $T$ using classical logic. By passing to logical equivalence classes ($p \equiv q$ iff $p \leq q$ and $q \leq p$), we get a poset with meets, joins, and complements satisfying the Boolean algebra axioms. This is called the *Lindenbaum algebra* of the theory $T$.
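Since everything in the power-set example is finite, it can be checked mechanically. The following Python sketch (the helper names are my own, chosen for illustration) verifies the complement laws and distributivity for $P\{1,2,3\}$ by brute force:

```python
from itertools import chain, combinations

def powerset(xs):
    """All subsets of xs, as frozensets."""
    return [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(xs), r) for r in range(len(xs) + 1))]

X = frozenset({1, 2, 3})
PX = powerset(X)

# Meet is intersection, join is union, and the complement of A is X \ A.
for A in PX:
    comp = X - A
    assert A & comp == frozenset()   # A ∧ ¬A = 0 (the empty set)
    assert A | comp == X             # A ∨ ¬A = 1 (all of X)

# Distributivity: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C).
for A in PX:
    for B in PX:
        for C in PX:
            assert A & (B | C) == (A & B) | (A & C)
```

The same brute-force check works for any small candidate lattice, which is often the quickest way to test whether a finite example is distributive or complemented.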

**Exercise**: Give an example of a complemented lattice which is *not* distributive.

As a possible leading hint for the previous exercise, here is a first order of business:

**Proposition**: In a distributive lattice, complements of elements are unique when they exist.

**Proof**: If both $b$ and $c$ are complementary to $a$, then $b = b \wedge 1 = b \wedge (a \vee c) = (b \wedge a) \vee (b \wedge c) = 0 \vee (b \wedge c) = b \wedge c$. Since $b = b \wedge c$, we have $b \leq c$. Similarly $c \leq b$, so $b = c$. $\Box$

The definition of Boolean algebra we have just given underscores its self-dual nature, but we gain more insight by packaging it in a way which stresses adjoint relationships — Boolean algebras are the same things as special types of Heyting algebras (recall that a Heyting algebra is a lattice which admits an implication operator satisfying an adjoint relationship with the meet operator).

**Theorem**: A lattice is a Boolean algebra if and only if it is a Heyting algebra in which either of the following properties holds:

- $x \wedge y \leq z$ if and only if $y \leq \neg x \vee z$ (where $\neg x$ denotes the negation $x \Rightarrow 0$)
- $\neg \neg x = x$ for all elements $x$

**Proof**: First let $X$ be a Boolean algebra, and let $\neg x$ denote the complement of an element $x$. Then I **claim** that $x \wedge y \leq z$ if and only if $y \leq \neg x \vee z$, proving that $X$ admits an implication $x \Rightarrow z := \neg x \vee z$. Then, taking $z = 0$, it follows that $x \Rightarrow 0 = \neg x \vee 0 = \neg x$, so that the Heyting negation coincides with the complement, whence 1. follows. Also, since (by definition of complement) $x$ is the complement of $y$ if and only if $y$ is the complement of $x$, we have $\neg \neg x = x$, whence 2. follows.

[Proof of claim: if $y \leq \neg x \vee z$, then $x \wedge y \leq x \wedge (\neg x \vee z) = (x \wedge \neg x) \vee (x \wedge z) = x \wedge z \leq z$. On the other hand, if $x \wedge y \leq z$, then $y = 1 \wedge y = (\neg x \vee x) \wedge y = (\neg x \wedge y) \vee (x \wedge y) \leq \neg x \vee z$. This completes the proof of the claim and of the forward implication.]

In the other direction, given a lattice which satisfies 1., it is automatically a Heyting algebra (with implication $x \Rightarrow z := \neg x \vee z$). In particular, it is distributive. From $x \wedge 1 \leq x$, we have (from 1.) $1 \leq \neg x \vee x$; since $\neg x \vee x \leq 1$ is automatic by definition of $1$, we get $\neg x \vee x = 1$. From $\neg x \leq \neg x \vee 0$, we have also (from 1.) that $x \wedge \neg x \leq 0$; since $0 \leq x \wedge \neg x$ is automatic by definition of $0$, we have $x \wedge \neg x = 0$. Thus under 1., every element $x$ has a complement $\neg x$.

On the other hand, suppose $X$ is a Heyting algebra satisfying 2.: $\neg \neg x = x$. As above, we know $x \wedge \neg x = 0$. By the corollary below, we also know the function $\neg = ((-) \Rightarrow 0)$ takes 0 to 1 and joins to meets (De Morgan law); since condition 2. says that $\neg$ is its own inverse, it follows that $\neg$ also takes meets to joins. Hence $x \vee \neg x = \neg \neg (x \vee \neg x) = \neg(\neg x \wedge \neg \neg x) = \neg(\neg x \wedge x) = \neg 0 = 1$. Thus for a Heyting algebra which satisfies 2., every element $x$ has a complement $\neg x$. This completes the proof. $\Box$
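To see that condition 2. has real content, it helps to have a Heyting algebra in hand where it fails. A minimal Python sketch (my own toy encoding, with meet and join as `min` and `max` on the three-element chain $0 < u < 1$):

```python
# The three-element chain 0 < u < 1 is a Heyting algebra but not Boolean.
chain3 = [0.0, 0.5, 1.0]   # u is encoded as 0.5; meet = min, join = max

def implies(x, y):
    """Heyting implication: the largest a in the chain with a ∧ x ≤ y."""
    return max(a for a in chain3 if min(a, x) <= y)

def neg(x):
    """Heyting negation: ¬x = (x ⇒ 0)."""
    return implies(x, 0.0)

u = 0.5
assert neg(u) == 0.0          # ¬u = 0
assert max(u, neg(u)) == u    # u ∨ ¬u = u ≠ 1: u has no complement
assert neg(neg(u)) == 1.0     # ¬¬u = 1 ≠ u: condition 2. fails
```

So the middle element of the chain violates both excluded middle and double negation, confirming that this Heyting algebra is not Boolean.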

**Exercise**: Show that Boolean algebras can also be characterized as meet-semilattices $X$ equipped with an operation $\neg: X \to X$, for which $x \wedge y \leq z$ if and only if $x \wedge \neg z \leq \neg y$.

The proof above invoked the De Morgan law $\neg(x \vee y) = \neg x \wedge \neg y$. The claim is that *this* De Morgan law (not the other, $\neg(x \wedge y) = \neg x \vee \neg y$!) holds in a general Heyting algebra — the relevant result was actually posed as an exercise from the previous lecture:

**Lemma**: For any element $c$ of a Heyting algebra $X$, the function $(-) \Rightarrow c: X \to X$ is an order-reversing map (equivalently, an order-preserving map $X^{op} \to X$, or an order-preserving map $X \to X^{op}$). It is **adjoint to itself**, in the sense that $(-) \Rightarrow c: X^{op} \to X$ is right adjoint to $(-) \Rightarrow c: X \to X^{op}$.

**Proof**: First, we show that $x \leq y$ in $X$ (equivalently, $y \leq x$ in $X^{op}$) implies $(y \Rightarrow c) \leq (x \Rightarrow c)$. But this conclusion holds iff $x \wedge (y \Rightarrow c) \leq c$, which is clear from $x \wedge (y \Rightarrow c) \leq y \wedge (y \Rightarrow c) \leq c$. Second, the adjunction holds because

$(x \Rightarrow c) \leq y$ in $X^{op}$ if and only if

$y \leq (x \Rightarrow c)$ in $X$ if and only if

$y \wedge x \leq c$ in $X$ if and only if

$x \wedge y \leq c$ in $X$ if and only if

$x \leq (y \Rightarrow c)$ in $X$. $\Box$

**Corollary**: $(-) \Rightarrow c: X^{op} \to X$ takes any inf which exists in $X^{op}$ to the corresponding inf in $X$. Equivalently, it takes any sup in $X$ to the corresponding inf in $X$, i.e., $(\sup_i x_i) \Rightarrow c = \inf_i (x_i \Rightarrow c)$. (In particular, this applies to finite joins in $X$, and in particular, it applies to the case $c = 0$, where we conclude, e.g., the De Morgan law $\neg(x \vee y) = \neg x \wedge \neg y$.)

**Remark**: If we think of sups as sums and infs as products, then we can think of implications $x \Rightarrow y$ as behaving like exponentials $y^x$. Indeed, our earlier result that $x \Rightarrow (-)$ preserves infs can then be recast in exponential notation as saying $(\prod_i y_i)^x = \prod_i (y_i)^x$, and our present corollary that $(-) \Rightarrow y$ takes sups to infs can then be recast as saying $y^{\sum_i x_i} = \prod_i y^{x_i}$. Later we will state another exponential law for implication. It is correct to assume that this is no notational accident!

Let me reprise part of the lemma (in the case $c = 0$), because it illustrates a situation which comes up over and over again in mathematics. In part it asserts that $\neg = ((-) \Rightarrow 0): X \to X$ is order-reversing, and that there is a three-way equivalence:

$x \leq \neg y$ if and only if $x \wedge y = 0$ if and only if $y \leq \neg x$.

This situation is an instance of what is called a “Galois connection” in mathematics. If $X$ and $Y$ are posets (or even preorders), a *Galois connection* between them consists of two order-reversing functions $f: X \to Y$, $g: Y \to X$ such that for all $x \in X$, $y \in Y$, we have $y \leq f(x)$ if and only if $x \leq g(y)$. (It’s actually an instance of an adjoint pair: if we consider $f$ as an order-preserving map $X \to Y^{op}$ and $g$ as an order-preserving map $Y^{op} \to X$, then $f(x) \leq y$ in $Y^{op}$ if and only if $x \leq g(y)$ in $X$.)

Here are some examples:

- The original example arises of course in Galois theory. If $k$ is a field and $k \subseteq E$ is a finite Galois extension with Galois group $G = \mathrm{Gal}(E/k)$ (of field automorphisms of $E$ which fix the elements belonging to $k$), then there is a Galois connection consisting of maps $\mathrm{Fix}: PG \to PE$ and $\mathrm{Stab}: PE \to PG$. This works as follows: to each subset $S \subseteq G$, define $\mathrm{Fix}(S)$ to be $\{x \in E : g(x) = x \text{ for all } g \in S\}$. In the other direction, to each subset $T \subseteq E$, define $\mathrm{Stab}(T)$ to be $\{g \in G : g(x) = x \text{ for all } x \in T\}$. Both $\mathrm{Fix}$ and $\mathrm{Stab}$ are order-reversing (for example, the larger the subset $T \subseteq E$, the more stringent the conditions for an element $g \in G$ to belong to $\mathrm{Stab}(T)$). Moreover, we have

  $T \subseteq \mathrm{Fix}(S)$ iff ($g(x) = x$ for all $g \in S$, $x \in T$) iff $S \subseteq \mathrm{Stab}(T)$,

  so we do get a Galois connection. It is moreover clear that for any $S \subseteq G$, $\mathrm{Fix}(S)$ is an intermediate field between $k$ and $E$, and for any $T \subseteq E$, $\mathrm{Stab}(T)$ is a subgroup of $G$. A principal result of Galois theory is that $\mathrm{Fix}$ and $\mathrm{Stab}$ are inverse to one another when restricted to the lattice of subgroups of $G$ and the lattice of fields intermediate between $k$ and $E$. Such a bijective correspondence induced by a Galois connection is called a *Galois correspondence*.

- Another basic Galois connection arises in algebraic geometry, between subsets $S \subseteq k[x_1, \ldots, x_n]$ (of a polynomial algebra over a field $k$) and subsets $T \subseteq k^n$. Given $S$, define $Z(S)$ (the *zero locus* of $S$) to be $\{(a_1, \ldots, a_n) \in k^n : p(a_1, \ldots, a_n) = 0 \text{ for all } p \in S\}$. On the other hand, define $I(T)$ (the *ideal* of $T$) to be $\{p \in k[x_1, \ldots, x_n] : p(a) = 0 \text{ for all } a \in T\}$. As in the case of Galois theory above, we clearly have a three-way equivalence

  $T \subseteq Z(S)$ iff ($p(a) = 0$ for all $p \in S$, $a \in T$) iff $S \subseteq I(T)$,

  so that $Z$, $I$ define a Galois connection between power sets (of the $n$-variable polynomial algebra and of $n$-dimensional affine space $k^n$). One defines an (affine algebraic) *variety* $V \subseteq k^n$ to be a zero locus of some set. Then, on very general grounds (see below), any variety is the zero locus of its ideal. On the other hand, notice that $I(T)$ is an ideal of the polynomial algebra. Not every ideal of the polynomial algebra is the ideal of its zero locus, but according to the famous Hilbert Nullstellensatz (for $k$ algebraically closed), those ideals equal to their radical are. Thus, $Z$ and $I$ become inverse to one another when restricted to the lattice of varieties and the lattice of radical ideals, by the Nullstellensatz: there is a Galois correspondence between these objects.

- Both of the examples above are particular cases of a very general construction. Let $X$, $Y$ be sets and let $R \subseteq X \times Y$ be any relation between them. Then $R$ sets up a Galois connection which in one direction takes a subset $S \subseteq X$ to $f(S) := \{y \in Y : (x, y) \in R \text{ for all } x \in S\}$, and in the other takes $T \subseteq Y$ to $g(T) := \{x \in X : (x, y) \in R \text{ for all } y \in T\}$. Once again we have a three-way equivalence

  $T \subseteq f(S)$ iff ($(x, y) \in R$ for all $x \in S$, $y \in T$) iff $S \subseteq g(T)$.

  There are *tons* of examples of this flavor.
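To make the general relational construction concrete, here is a small Python sketch using a divisibility relation as a toy $R$ (the particular sets and names are my own, chosen just for illustration); it checks the three-way equivalence on a sample pair of subsets:

```python
def galois(R, Xs, Ys):
    """Galois connection induced by a relation R ⊆ Xs × Ys."""
    f = lambda S: {y for y in Ys if all((x, y) in R for x in S)}
    g = lambda T: {x for x in Xs if all((x, y) in R for y in T)}
    return f, g

# Toy relation: x R y iff x divides y.
Xs, Ys = {2, 3, 4}, {6, 8, 12}
R = {(x, y) for x in Xs for y in Ys if y % x == 0}
f, g = galois(R, Xs, Ys)

S, T = {2, 3}, {6, 12}
# Three-way equivalence: T ⊆ f(S) iff (x R y for all x ∈ S, y ∈ T) iff S ⊆ g(T).
assert T <= f(S)
assert all((x, y) in R for x in S for y in T)
assert S <= g(T)
```

Here $f(S)$ is the set of common multiples of $S$ inside $Ys$, and $g(T)$ the set of common divisors of $T$ inside $Xs$; the larger $S$ gets, the smaller $f(S)$ gets, as expected of an order-reversing map.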

As indicated above, a Galois connection between posets $X$ and $Y$ is essentially the same thing as an adjoint pair between $X$ and $Y^{op}$ (or between $X^{op}$ and $Y$ if you prefer; Galois connections are after all symmetric in $X$ and $Y$). I would like to record a few basic results about Galois connections/adjoint pairs.

**Proposition**:

- Given order-reversing maps $f: X \to Y$, $g: Y \to X$ which form a Galois connection, we have $x \leq g(f(x))$ for all $x \in X$ and $y \leq f(g(y))$ for all $y \in Y$. (Given poset maps $f: X \to Y$, $g: Y \to X$ which form an adjoint pair $f \dashv g$, we have $x \leq g(f(x))$ for all $x \in X$ and $f(g(y)) \leq y$ for all $y \in Y$.)
- Given a Galois connection as above, $f(x) = f(g(f(x)))$ for all $x$ and $g(y) = g(f(g(y)))$ for all $y$. (Given an adjoint pair as above, the same equations hold.) Therefore a Galois connection induces a Galois correspondence between the elements of the form $f(x)$ and the elements of the form $g(y)$.

**Proof**: (1.) It suffices to prove the statements for adjoint pairs. But under the assumption $f \dashv g$, $x \leq g(f(x))$ if and only if $f(x) \leq f(x)$, which is certainly true. The other statement is dual.

(2.) Again it suffices to prove the equations for the adjoint pair. Applying the order-preserving map $f$ to $x \leq g(f(x))$ from 1. gives $f(x) \leq f(g(f(x)))$. Applying $f(g(y)) \leq y$ from 1. to $y = f(x)$ gives $f(g(f(x))) \leq f(x)$. Hence $f(x) = f(g(f(x)))$. The other equation is dual. $\Box$

Incidentally, the equations of 2. show why an algebraic variety is the zero locus of its ideal (see example 2. above): if $V = Z(S)$ for some set of polynomials $S$, then $V = Z(S) = Z(I(Z(S))) = Z(I(V))$. They also show that for any element $x$ in a Heyting algebra, we have $\neg \neg \neg x = \neg x$, even though $\neg \neg x = x$ is in general false.

Let $f: X \to Y$, $g: Y \to X$ be a Galois connection (or an adjoint pair). By the proposition, $c := gf: X \to X$ is an order-preserving map with the following properties:

$x \leq c(x)$ for all $x \in X$;

$c(c(x)) = c(x)$ for all $x \in X$.

Poset maps $c: X \to X$ with these properties are called *closure operators*. We have earlier discussed examples of closure operators: if for instance $G$ is a group, then the operator $c: PG \to PG$ which takes a subset $S \subseteq G$ to the subgroup generated by $S$ is a closure operator. Or, if $X$ is a topological space, then the operator which takes a subset $S \subseteq X$ to its topological closure $\bar{S}$ is a closure operator. Or, if $X$ is a poset, then the operator $c: PX \to PX$ which takes $S \subseteq X$ to $\{a \in X : a \leq s \text{ for some } s \in S\}$ is a closure operator. Examples like these can be multiplied at will.

One virtue of closure operators is that they give a useful means of constructing new posets from old. Specifically, if $c: X \to X$ is a closure operator, then a *fixed point* of $c$ (or a *$c$-closed* element of $X$) is an element $x$ such that $c(x) = x$. The collection $\mathrm{Fix}(c)$ of fixed points is partially ordered by the order in $X$. For example, the lattice of fixed points of the operator $c: PG \to PG$ above is the lattice of subgroups of $G$. For any closure operator $c$, notice that $\mathrm{Fix}(c)$ is the same as the image $c(X)$ of $c$.
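One can watch a Galois connection manufacture a closure operator in miniature. This Python sketch reuses the toy divisibility relation from before (all names are my own); it checks that $c = gf$ satisfies the two closure axioms and that its fixed points coincide with its image:

```python
from itertools import chain, combinations

def subsets(xs):
    """All subsets of xs, as frozensets."""
    return [frozenset(c) for c in chain.from_iterable(
        combinations(sorted(xs), r) for r in range(len(xs) + 1))]

# Galois connection induced by the divisibility relation between Xs and Ys.
Xs, Ys = {2, 3, 4}, {6, 8, 12}
R = {(x, y) for x in Xs for y in Ys if y % x == 0}
f = lambda S: frozenset(y for y in Ys if all((x, y) in R for x in S))
g = lambda T: frozenset(x for x in Xs if all((x, y) in R for y in T))
c = lambda S: g(f(S))          # the closure operator c = g ∘ f

for S in subsets(Xs):
    assert S <= c(S)           # S ≤ c(S)
    assert c(c(S)) == c(S)     # c(c(S)) = c(S)

fixed = {S for S in subsets(Xs) if c(S) == S}
image = {c(S) for S in subsets(Xs)}
assert fixed == image          # Fix(c) is exactly the image of c
```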

One particular use is that the fixed points of the double negation closure $\neg\neg: X \to X$ on a Heyting algebra $X$ form a Boolean algebra $\mathrm{Fix}(\neg\neg)$, and the map $\neg\neg: X \to \mathrm{Fix}(\neg\neg)$ is a Heyting algebra map. This is not trivial! And it gives a means of constructing some rather exotic Boolean algebras (“atomless Boolean algebras”) which may not be so familiar to many readers.

The following exercises are in view of proving these results. If no one else does, I will probably give solutions next time or sometime soon.

**Exercise**: If $X$ is a Heyting algebra and $x, y, z \in X$, prove the “exponential law” $((x \wedge y) \Rightarrow z) = (x \Rightarrow (y \Rightarrow z))$. Conclude that $\neg(x \wedge y) = (x \Rightarrow \neg y) = (y \Rightarrow \neg x)$.

**Exercise**: We have seen that $(x \Rightarrow y) \wedge x \leq y$ in a Heyting algebra. Use this to prove $(x \Rightarrow y) \wedge (y \Rightarrow z) \leq (x \Rightarrow z)$.

**Exercise**: Show that double negation $\neg\neg: X \to X$ on a Heyting algebra preserves finite meets, i.e., that $\neg\neg(x \wedge y) = \neg\neg x \wedge \neg\neg y$. (The inequality $\neg\neg(x \wedge y) \leq \neg\neg x \wedge \neg\neg y$ is easy. The reverse inequality takes more work; try using the previous two exercises.)

**Exercise**: If $c: X \to X$ is a closure operator, show that the inclusion map $i: \mathrm{Fix}(c) \to X$ is right adjoint to the projection $c: X \to \mathrm{Fix}(c)$ to the image of $c$. Conclude that meets of elements in $\mathrm{Fix}(c)$ are calculated as they would be as elements in $X$, and also that $c: X \to \mathrm{Fix}(c)$ preserves joins.

**Exercise**: Show that the fixed points of the double negation operator $\neg\neg$ on a topology (as Heyting algebra) are the *regular* open sets, i.e., those open sets equal to the interior of their closure. Give some examples of non-regular open sets. Incidentally, is the lattice you get by taking the opposite of a topology also a Heyting algebra?
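As a finite warm-up to this last exercise (it computes rather than proves anything), here is a Python sketch with a hand-picked topology on $\{0, 1, 2\}$ of my own; negation is computed as the interior of the complement, and the open set $\{0, 2\}$ turns out not to be $\neg\neg$-fixed:

```python
X = frozenset({0, 1, 2})
# A topology on X (checked by hand to be closed under unions and intersections).
opens = [frozenset(), frozenset({0}), frozenset({2}),
         frozenset({0, 2}), X]

def interior(S):
    """Union of all open sets contained in S (the largest such open)."""
    result = frozenset()
    for U in opens:
        if U <= S:
            result |= U
    return result

def neg(U):
    """Heyting negation in the topology: the interior of the complement of U."""
    return interior(X - U)

regular = [U for U in opens if neg(neg(U)) == U]
assert neg(frozenset({0, 2})) == frozenset()   # ¬{0,2} = ∅
assert neg(neg(frozenset({0, 2}))) == X        # ¬¬{0,2} = X ≠ {0,2}
assert frozenset({0, 2}) not in regular
```

The four $\neg\neg$-fixed opens ($\emptyset$, $\{0\}$, $\{2\}$, $X$) do indeed form a four-element Boolean algebra, in line with the result stated above.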

---

Last time in this series on Stone duality, we introduced the concept of lattice and various cousins (e.g., inf-lattice, sup-lattice). We said a lattice is a poset with finite meets and joins, and that inf-lattices and sup-lattices have arbitrary meets and joins (meaning that every subset, not just every finite one, has an inf and sup). Examples include the poset $PX$ of all subsets of a set $X$, and the poset $\mathrm{Sub}(V)$ of all subspaces of a vector space $V$.

I take it that most readers are already familiar with many of the properties of the poset $PX$; there is for example the distributive law $A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$, and De Morgan laws, and so on — we’ll be exploring more of that in depth soon. The poset $\mathrm{Sub}(V)$, as a lattice, is a much different animal: if we think of meets and joins as modeling the logical operations “and” and “or”, then the logic internal to $\mathrm{Sub}(V)$ is a weird one — it’s actually much closer to what is sometimes called “quantum logic”, as developed by von Neumann, Mackey, and many others. Our primary interest in this series will be in the direction of more familiar forms of logic, *classical logic* if you will (where “classical” here is meant more in a physicist’s sense than a logician’s).

To get a sense of the weirdness of $\mathrm{Sub}(V)$, take for example a 2-dimensional vector space $V$. The bottom element is the zero space $0$, the top element is $V$, and the rest of the elements of $\mathrm{Sub}(V)$ are 1-dimensional: lines through the origin. For 1-dimensional spaces $x, y$, there is no relation $x \leq y$ unless $x$ and $y$ coincide. So we can picture the lattice as having three levels according to dimension, with lines drawn to indicate the partial order:

      V = 1
      / | \
     /  |  \
    x   y   z
     \  |  /
      \ | /
        0

Observe that for distinct elements $x, y, z$ in the middle level, we have for example $x \wedge y = 0$ ($0$ is the largest element contained in both $x$ and $y$), and also for example $y \vee z = 1$ ($1$ is the smallest element containing $y$ and $z$). It follows that $x \wedge (y \vee z) = x \wedge 1 = x$, whereas $(x \wedge y) \vee (x \wedge z) = 0 \vee 0 = 0$. The distributive law fails in $\mathrm{Sub}(V)$!
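This failure can be verified mechanically in the smallest case: over the two-element field, the subspace lattice of $V = (\mathbb{F}_2)^2$ is exactly the three-lines picture above. A Python sketch (the helper names are my own):

```python
from itertools import product

zero = (0, 0)

def add(u, v):
    """Vector addition in (F_2)^2."""
    return ((u[0] + v[0]) % 2, (u[1] + v[1]) % 2)

def span(S):
    """Subspace of (F_2)^2 generated by S: close {0} ∪ S under addition."""
    T = {zero} | set(S)
    changed = True
    while changed:
        changed = False
        for u, v in list(product(T, T)):
            if add(u, v) not in T:
                T.add(add(u, v))
                changed = True
    return frozenset(T)

x = span([(1, 0)])               # the three lines through the origin
y = span([(0, 1)])
z = span([(1, 1)])
top = span([(1, 0), (0, 1)])     # the whole space V

meet = lambda a, b: a & b        # meet of subspaces = intersection
join = lambda a, b: span(a | b)  # join of subspaces = span of the union

assert meet(x, join(y, z)) == x                  # x ∧ (y ∨ z) = x ∧ 1 = x
assert join(meet(x, y), meet(x, z)) == span([])  # (x ∧ y) ∨ (x ∧ z) = 0
```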

**Definition**: A lattice is *distributive* if $x \wedge (y \vee z) = (x \wedge y) \vee (x \wedge z)$ for all $x, y, z$. That is to say, a lattice $X$ is distributive if the map $x \wedge (-): X \to X$, taking an element $y$ to $x \wedge y$, is a morphism of join-semilattices.

**Exercise**: Show that in a meet-semilattice, $x \wedge (-)$ is a poset map. Is it also a morphism of meet-semilattices? If $X$ has a bottom element, show that the map $x \wedge (-)$ preserves it.

**Exercise**: Show that in any lattice, we at least have the inequality $(x \wedge y) \vee (x \wedge z) \leq x \wedge (y \vee z)$ for all elements $x, y, z$.

Here is an interesting theorem, which illustrates some of the properties of lattices we've developed so far:

**Theorem**: The notion of distributive lattice is self-dual.

**Proof**: The notion of lattice is self-dual, so all we have to do is show that the dual of the distributivity axiom, $x \vee (y \wedge z) = (x \vee y) \wedge (x \vee z)$, follows from the distributive lattice axioms.

Expand the right side to $[(x \vee y) \wedge x] \vee [(x \vee y) \wedge z]$, by distributivity. This reduces to $x \vee [(x \vee y) \wedge z]$, by an absorption law. Expand this again, by distributivity, to $x \vee [(x \wedge z) \vee (y \wedge z)]$. This reduces to $x \vee (y \wedge z)$, by the other absorption law. This completes the proof. $\Box$

Distributive lattices are important, but perhaps even more important in mathematics are lattices where we have not just finitary, but infinitary distributivity as well:

**Definition**: A *frame* is a sup-lattice for which $x \wedge (-)$ is a morphism of sup-lattices, for every $x$. In other words, for every subset $S \subseteq X$, we have $x \wedge \sup(S) = \sup \{x \wedge s : s \in S\}$, or, as is often written,

$x \wedge \bigvee_{s \in S} s = \bigvee_{s \in S} (x \wedge s).$

**Example**: A power set $PX$, as always partially ordered by inclusion, is a frame. In this case, it means that for any subset $A$ and any collection of subsets $\{B_i\}_{i \in I}$, we have

$A \cap \left( \bigcup_{i \in I} B_i \right) = \bigcup_{i \in I} (A \cap B_i).$

This is a well-known fact from naive set theory, but soon we will see an alternative proof, thematically closer to the point of view of these notes.

**Example**: If $X$ is a set, a *topology* on $X$ is a subset $T \subseteq PX$ of the power set, partially ordered by inclusion as $PX$ is, which is closed under finite meets and arbitrary sups. This means the empty sup or bottom element $\emptyset$ and the empty meet or top element $X$ of $PX$ are elements of $T$, and also:

- If $U, V$ are elements of $T$, then so is $U \cap V$.
- If $\{U_i\}_{i \in I}$ is a collection of elements of $T$, then $\bigcup_{i \in I} U_i$ is an element of $T$.

A *topological space* is a set $X$ which is equipped with a topology $T$; the elements of the topology are called *open subsets* of the space. Topologies provide a primary source of examples of frames; because the sups and meets in a topology are constructed the same way as in $PX$ (unions and finite intersections), it is clear that the requisite infinite distributivity law holds in a topology.

The concept of topology was originally rooted in analysis, where it arose by contemplating very generally what one means by a "continuous function". I imagine many readers who come to a blog titled "Topological Musings" will already have had a course in general topology! But just to be on the safe side, I'll now give one example of a topological space, with a promise of more to come later. Let $\mathbb{R}^n$ be the set of $n$-tuples of real numbers. First, define the open ball in $\mathbb{R}^n$ centered at a point $x \in \mathbb{R}^n$ and of radius $r > 0$ to be the set $\{y \in \mathbb{R}^n : ||x - y|| < r\}$. Then, define a subset $U \subseteq \mathbb{R}^n$ to be *open* if it can be expressed as the union of a collection, finite or infinite, of (possibly overlapping) open balls; the topology is by definition the collection of open sets.

It's clear from the definition that the collection of open sets is indeed closed under arbitrary unions. To see it is closed under finite intersections, the crucial lemma needed is that the intersection of two overlapping open balls is itself a union of smaller open balls. A precise proof makes essential use of the *triangle inequality*. (Exercise?)

Topology is a huge field in its own right; much of our interest here will be in its interplay with *logic*. To that end, I want to bring in, in addition to the connectives "and" and "or" we've discussed so far, the *implication connective* in logic. Most readers probably know that in ordinary logic, the formula $p \Rightarrow q$ ("$p$ implies $q$") is equivalent to "either not $p$ or $q$" -- symbolically, we could define $p \Rightarrow q$ as $\neg p \vee q$. That much is true -- in ordinary Boolean logic. But instead of committing ourselves to this reductionistic habit of defining implication in this way, or otherwise relying on Boolean algebra as a crutch, I want to take a fresh look at material implication and what we really ask of it.

The main property we ask of implication is *modus ponens*: given $p$ and $p \Rightarrow q$, we may infer $q$. In symbols, writing the inference or entailment relation as $\leq$, this is expressed as $(p \Rightarrow q) \wedge p \leq q$. And, **we ask that implication be the weakest possible such assumption**, i.e., that material implication $p \Rightarrow q$ be the weakest $a$ whose presence in conjunction with $p$ entails $q$. In other words, for given $p$ and $q$, we now define implication $p \Rightarrow q$ by the property

$a \wedge p \leq q$ if and only if $a \leq (p \Rightarrow q).$

As a very easy exercise, show by Yoneda that an implication $p \Rightarrow q$ is uniquely determined when it exists. As the next theorem shows, not all lattices admit an implication operator; in order to have one, it is necessary that distributivity holds:

**Theorem**:

- (1) If $X$ is a meet-semilattice which admits an implication operator, then for every element $a$, the operator $a \wedge (-)$ preserves any sups which happen to exist in $X$.
- (2) If $X$ is a frame, then $X$ admits an implication operator.

**Proof**: (1) Suppose $S \subseteq X$ has a sup in $X$, here denoted $\sup(S)$. We have

$a \wedge \sup(S) \leq b$ in $X$ if and only if

$\sup(S) \leq (a \Rightarrow b)$ in $X$ if and only if

$s \leq (a \Rightarrow b)$ for all $s \in S$ if and only if

$a \wedge s \leq b$ for all $s \in S$ if and only if

$\sup_{s \in S} (a \wedge s) \leq b$ in $X$.

Since this is true for all $b$, the (dual of the) Yoneda principle tells us that $a \wedge \sup(S) = \sup_{s \in S} (a \wedge s)$, as desired. (We don't need to add the hypothesis that the sup on the right side exists, for the first four lines after "We have" show that $a \wedge \sup(S)$ satisfies the defining property of that sup.)

(2) Suppose $a, b$ are elements of a frame $X$. Define $(a \Rightarrow b)$ to be $\sup \{x \in X : a \wedge x \leq b\}$. By definition, if $a \wedge x \leq b$, then $x \leq (a \Rightarrow b)$. Conversely, if $x \leq (a \Rightarrow b)$, then

$a \wedge x \leq a \wedge \sup \{y : a \wedge y \leq b\} = \sup \{a \wedge y : a \wedge y \leq b\},$

where the equality holds because of the infinitary distributive law in a frame, and this last sup is clearly bounded above by $b$ (according to the defining property of sups). Hence $a \wedge x \leq b$, as desired. $\Box$

Incidentally, part (1) of this theorem gives an alternative proof of the infinitary distributive law for Boolean algebras such as $PX$, so long as we trust that $\neg p \vee q$ really does what we ask of implication. We'll come to that point again later.

Part (2) has some interesting consequences *vis à vis* topologies: we know that topologies provide examples of frames; therefore by part (2) they admit implication operators. It is instructive to work out exactly what these implication operators look like. So, let $U, V$ be open sets in a topology on a set $X$. According to our prescription, we define $U \Rightarrow V$ as the sup (the union) of all open sets $W$ with the property that $U \cap W \subseteq V$. We can think of this inclusion as living in the power set $PX$. Then, assuming our formula $\neg U \cup V$ for implication in the Boolean algebra $PX$ (where $\neg U$ denotes the complement of $U$ in $X$), we would have $U \cap W \subseteq V$ if and only if $W \subseteq \neg U \cup V$. And thus, our implication $U \Rightarrow V$ *in the topology* is the union of all open sets $W$ contained in the (usually non-open) set $\neg U \cup V$. That is to say, $U \Rightarrow V$ is the *largest* open contained in $\neg U \cup V$, otherwise known as the *interior* of $\neg U \cup V$. Hence our formula:

$(U \Rightarrow V) = \mathrm{int}(\neg U \cup V).$
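For a finite topology, both descriptions of $U \Rightarrow V$ can be computed and compared directly. Here is a Python sketch (the five-open-set topology on $\{0, 1, 2\}$ is a toy example of my own) confirming that the frame-theoretic implication agrees with the interior of $\neg U \cup V$:

```python
X = frozenset({0, 1, 2})
opens = [frozenset(), frozenset({0}), frozenset({2}),
         frozenset({0, 2}), X]   # a topology on X, checked by hand

def interior(S):
    """Union of all open sets contained in S."""
    result = frozenset()
    for W in opens:
        if W <= S:
            result |= W
    return result

def implies(U, V):
    """Frame-theoretic implication: union of all open W with U ∩ W ⊆ V."""
    result = frozenset()
    for W in opens:
        if U & W <= V:
            result |= W
    return result

# The two descriptions agree: (U ⇒ V) = int((X \ U) ∪ V).
for U in opens:
    for V in opens:
        assert implies(U, V) == interior((X - U) | V)
```

In particular, $U \Rightarrow \emptyset$ recovers the negation from the exercises above: the interior of the complement of $U$.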

**Definition**: A *Heyting algebra* is a lattice $X$ which admits an implication $(x \Rightarrow y)$ for any two elements $x, y \in X$. A *complete Heyting algebra* is a complete lattice which admits an implication for any two elements.

Again, our theorem above says that frames are (extensionally) the same thing as complete Heyting algebras. But, as in the case of inf-lattices and sup-lattices, we make *intensional* distinctions when we consider the appropriate notions of morphism for these concepts. In particular, a *morphism of frames* is a poset map which preserves finite meets and arbitrary sups. A *morphism of Heyting algebras* preserves all structure in sight (i.e., everything implied in the definition of Heyting algebra -- meets, joins, and implication). A *morphism of complete Heyting algebras* also preserves all structure in sight (sups, infs, and implication).

Heyting algebras are usually **not** Boolean algebras. For example, it is rare that a topology is a Boolean lattice. We'll be speaking more about that soon, but for now I'll remark that Heyting algebra is the algebra which underlies *intuitionistic propositional calculus*.

**Exercise**: Show that $x \leq y$ if and only if $(x \Rightarrow y) = 1$ in a Heyting algebra.

**Exercise**: (For those who know some general topology.) In a Heyting algebra, we define the negation $\neg x$ to be $x \Rightarrow 0$. For the Heyting algebra given by a topology, what can you say about $\neg U$ when $U$ is open and *dense*?
