Posts tagged with semantics

A tiny portion of Doug Hofstadter’s “semantic network”.


I’m working on a longish post about dichotomies. It’s going to be about mathematical objects that can serve as metaphors to think beyond binary opposition.

In researching the article, I found the following in the Internet Encyclopedia:

According to Jacques Derrida,[citation needed] meaning in the West is defined in terms of binary oppositions, “a violent hierarchy” where “one of the two terms governs the other.”

I don’t know if Derrida actually said that. But I can already think of a counterexample from mathematics.


The number √−1 is logically equivalent to −√−1. In other words, i and −i are indistinguishable.

Doug Hofstadter was fond of making this point to us.

  • Complex conjugation would work the same.
  • Addition, subtraction, multiplication, and division would work the same.
  • The anticlockwise direction in the complex plane is arbitrary. If the “southern” i were the one we currently call +i, then we’d do things clockwise and everything would work out the same.
  • So integration and differentiation would work the same as well.
  • (On the other hand, −1 is not the same as +1: under multiplication, −1 instantiates an “alternating” pattern whereas +1 instantiates a “stay the same” pattern.)
  • It’s like group theory. Say we’re talking about the group P₃. Any of the atoms could be called “first”, “second”, or “third”. It wouldn’t matter.

    What matters is the structure, the relationships, the way they do things. None of them is “worse”, “better”, “before”, “after”, or “dominated by” the others; they simply relate to each other in the P₃ way.

So right there, you’ve got a binary opposition where neither term governs the other.
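The swap can even be checked mechanically. Here's a minimal sketch (my illustration, not Hofstadter's own): relabeling i as −i is just complex conjugation, and conjugation preserves all four field operations, so no arithmetic identity can tell the two square roots of −1 apart.

```python
import random

def swap(z: complex) -> complex:
    """Relabel i as -i (i.e. complex conjugation)."""
    return z.conjugate()

def close(u: complex, v: complex) -> bool:
    return abs(u - v) < 1e-9

random.seed(0)
for _ in range(1000):
    a = complex(random.uniform(-5, 5), random.uniform(-5, 5))
    b = complex(random.uniform(-5, 5), random.uniform(-5, 5))
    assert close(swap(a + b), swap(a) + swap(b))   # addition survives the swap
    assert close(swap(a - b), swap(a) - swap(b))   # so does subtraction
    assert close(swap(a * b), swap(a) * swap(b))   # and multiplication
    assert close(swap(a / b), swap(a) / swap(b))   # and division

print("swapping i and -i preserves +, -, *, /")
```

Every identity that holds before the relabeling holds after it, which is exactly what "neither term governs the other" means here.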

Ever since I took too many mathematics classes, I've been using the concept of “upper bound” literally. It confuses people.

  • Girlfriend: How long do you think it’ll take you to work out?
  • Me: Oh, I don’t know. I’d say less than five hours.
  • Girlfriend: Five hours?! What are you planning to do there?
  • Me: I didn’t say it would take five hours. I said it would take less than five hours.

Well, it did take me less than five hours. It took me an hour and a half.


Same problem with confidence intervals: I give very literal answers. My former boss told me a story about a “rationality test” she was given by a statistician or something. First she was asked to guess some fact that only a 5th grader would know, like how many tons the moon weighs or the square mileage of Antarctica. Then the statistician asked her to give 90% confidence bounds: that is, two numbers such that you're 90% sure the true value lies between them. Most people fail by naming numbers close to their original guess.


My boss just said, “I’m confident that the number is somewhere between zero and a trillion trillion trillion trillion trillion trillion trillion … trillion.” Well, she was correct! And she was one of the only ones.

You can make confidence bounds as wide as you want and be logically correct. People will look at you in bewilderment when you say things like “I’ll be gone somewhere between two seconds and seven days,” but you will not be a liar.
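A toy Monte Carlo makes the point (my illustration, not the statistician's actual test; the distribution and the interval endpoints are made up): a tight band around a bad point estimate covers the truth rarely, while an absurdly wide band is trivially correct every time.

```python
import random

random.seed(42)
trials = 10_000
# Pretend the "true value" is drawn from some heavy-tailed distribution.
truths = [random.lognormvariate(10, 2) for _ in range(trials)]

def coverage(lo, hi):
    """Fraction of trials where the interval [lo, hi] contains the truth."""
    return sum(lo <= t <= hi for t in truths) / trials

# Numbers close to a point guess: narrow, and usually wrong.
narrow = coverage(20_000, 30_000)
# "Zero to a trillion trillion ...": ridiculous, but logically safe.
wide = coverage(0.0, 1e72)

print(f"narrow interval coverage: {narrow:.1%}")
print(f"wide interval coverage:   {wide:.1%}")
```

The wide interval hits 100% coverage; the narrow one falls far short of the 90% it was supposed to deliver.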


Mathematicians are the only people who go around making statements like “I found out that the answer is greater than 6 and less than 3→3→64→2. I’m pretty sure the answer is 13, though.”

(3→3→64→2 is unimaginably large — far bigger than the number of particles in billions of observable universes.)
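Conway's chained-arrow notation has simple recursive rules, so here's a toy evaluator (my sketch; only the notation itself is standard). It is safe only for tiny arguments — anything like 3→3→64→2 would dwarf what any computer can hold.

```python
def chain(*xs):
    """Evaluate a Conway chained-arrow expression, e.g. chain(3, 3, 2) = 3→3→2."""
    xs = list(xs)
    while len(xs) > 1 and xs[-1] == 1:
        xs.pop()                        # X → 1  =  X
    if len(xs) == 1:
        return xs[0]                    # a single number is itself
    if len(xs) == 2:
        return xs[0] ** xs[1]           # a → b  =  a ** b
    *head, p, q = xs
    if p == 1:
        return chain(*head)             # X → 1 → q  =  X
    # X → p → q  =  X → (X → (p-1) → q) → (q-1)
    return chain(*head, chain(*head, p - 1, q), q - 1)

print(chain(3, 3))     # 27
print(chain(2, 2, 2))  # 4
print(chain(3, 3, 2))  # 7625597484987, i.e. 3^3^3
```

Each extra link in the chain iterates the whole previous process, which is why four-link chains escape ordinary magnitudes almost immediately.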


It’s not even the difference between certainty-of-proof and casual guesstimation. It’s the difference between giving an upper bound and giving the least upper bound (supremum).

I can say with complete confidence that I will never earn over 10^18737 pounds in my life, no matter how much hyperinflation or life-extending medicine lies in the future. I can say the same about 10^18736 pounds. How low am I willing to go with these statements? Ay, there’s the lub.
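The distinction can be made concrete with a toy set (my example, using the classic sequence 1 − 1/n): every number at or above 1 is an upper bound, but only 1 is the *least* upper bound, and anything below 1 eventually fails.

```python
# Finite sample of the infinite set {1 - 1/n : n = 1, 2, 3, ...},
# whose supremum is 1 (never attained by any element).
s = [1 - 1/n for n in range(1, 10_000)]

def is_upper_bound(M):
    return all(x <= M for x in s)

print(is_upper_bound(10**18737))  # valid, just wildly unhelpful
print(is_upper_bound(5))          # also valid
print(is_upper_bound(1))          # valid, and the least such bound (the sup)
print(is_upper_bound(0.999))      # fails: some element exceeds it
```

Any of the first three answers makes you logically correct; only the third tells anyone anything sharp.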


No less an intellect than Paul Graham swaps upper and lower bounds. Describing how he’s designing a new programming language (arc) with the goal that useful programs be as short as possible in it, he writes:

That’s part of why I focus on code size. Length is an external constraint. If you start looking at code thinking "what is the lower bound on how long this has to be?" you’re one step from discovering the new operator that will make it that short.

I might be the only person who reads this and gets confused. When I hear “lower bound” I think “Nothing is lower than the lower bound. It has to be bigger than the lower bound.” But he talks as if putting a lower bound on the code means the code gets shorter. Zoinks?

And then I’m like, oh. Duh. He means “the lower bound on the upper bound on how long this has to be.” Supremum. Well, you could have just said that, Paul. Or I could not think so literally.
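One way to see the two bounds at once (my toy example, nothing to do with Arc itself): every correct program's length is an upper bound on "how long this has to be," the shortest program found so far is the best known such bound, and Graham's "lower bound" is the floor below which no correct program can go.

```python
# Three correct implementations of the same function, in descending verbosity.
candidates = [
    "def double(x):\n    result = x * 2\n    return result",
    "def double(x):\n    return x * 2",
    "double = lambda x: x * 2",
]

# Sanity-check that each candidate really computes the same function.
for src in candidates:
    ns = {}
    exec(src, ns)
    assert all(ns["double"](n) == 2 * n for n in range(10))

lengths = sorted(len(src) for src in candidates)
print("upper bounds on required length:", lengths)
print("best known (least) upper bound:", lengths[0])
```

Discovering a new operator, in this picture, means finding a candidate whose length beats the current minimum — pushing the known least upper bound down toward the true floor.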