# Lanchester’s Laws

In 1916 English engineer Frederick Lanchester set out to find a mathematical model to describe conflicts between two armies. In ancient times, he reasoned, each soldier engaged with one enemy at a time, so the number of soldiers who survived a battle was simply the difference in size between the two armies. But the advent of modern combat, including long-range weapons such as firearms, changes things. Suppose two armies, A and B, are fighting. A and B represent the number of soldiers in each army, and a and b represent the number of enemy fighters that each soldier can kill per unit time. Now the equations

$\displaystyle \frac{dA}{dt} = -bB, \qquad \frac{dB}{dt} = -aA$

show us the rate at which the size of each army is changing at a given instant. And these give us

$\displaystyle bB^{2} - aA^{2} = C,$

where C is a constant.

This is immediately revealing. It shows that the strength of an army depends more on its bare size than on the sophistication of its weapons. In order to meet an army twice your size you’d need weapons (or fighting skills) that are four times as effective.
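
The invariant is easy to check numerically. Here is a minimal sketch (function and parameter names are my own, not Lanchester's) that integrates the two equations with Euler steps and shows an army twice the size defeating an equally armed opponent:

```python
# Euler integration of Lanchester's square-law equations:
#   dA/dt = -b*B,  dB/dt = -a*A
# All names and parameter values below are illustrative.

def simulate(A, B, a, b, dt=0.001, max_steps=100_000):
    """Advance both armies until one is annihilated (or time runs out)."""
    for _ in range(max_steps):
        # Simultaneous update: both right-hand sides use the old values.
        A, B = max(A - b * B * dt, 0.0), max(B - a * A * dt, 0.0)
        if A == 0.0 or B == 0.0:
            break
    return A, B

# Equal weapons (a = b = 1), but B starts twice as large.  The invariant
# predicts B wins with about sqrt(b*B0^2 - a*A0^2) = sqrt(3,000,000) ≈ 1732
# survivors.
A_end, B_end = simulate(A=1000, B=2000, a=1.0, b=1.0)
```

Note that the survivor count depends on the *squares* of the initial sizes, which is exactly why doubling an army requires quadrupling the enemy's weapon effectiveness to compensate.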

Simple as they are, these ideas shed light on the historic choices of leaders such as Nelson, who sought to divide his enemies into small groups, and Lanchester himself illustrated his point by referring to the British and German navies then at war. Today his ideas (and their descendants) inform the rules behind tabletop and computer wargames.

# Fitting

Asteroid 46610 was named Bésixdouze in homage to Antoine de Saint-Exupéry’s character Le Petit Prince, who lived on Asteroid B-612. Fittingly, 46610 written in hexadecimal is B612.

(Thanks, Dan.)

# The German Tank Problem

During World War II, as they mulled whether to attempt an invasion of the continent, the Allies needed to estimate the number of tanks Germany was producing. They asked their intelligence services to guess the number by spying on German factories and counting tanks on the battlefield, but these efforts produced contradictory estimates. Finally they resorted to statistical analysis.

They did this by studying the serial numbers on captured and destroyed German tanks. Suppose German tanks are numbered sequentially 1, 2, 3, …, B, where B is the total number of tanks that we seek to know. And suppose that we have five captured tanks whose serial numbers are 21, 35, 42, 60, and 89. It turns out that

$\displaystyle B = \frac{(N+1)M}{N} - 1,$

where N is the sample size (here, 5) and M is the highest sampled number (here, 89). In this example, the formula tells us that B = 105.8, so we’d estimate that 106 tanks had been produced at that time.
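
In code the estimator is a one-liner; here is a sketch (the function name is mine):

```python
def estimate_total(serials):
    """Estimate the highest serial number B from a sample of serials:
    B = M(N+1)/N - 1, where M is the sample maximum and N the sample size."""
    n, m = len(serials), max(serials)
    return m * (n + 1) / n - 1

# The five captured tanks from the example above:
estimate = estimate_total([21, 35, 42, 60, 89])  # 105.8
```

Intuitively, the average gap between sampled serial numbers estimates the gap between the largest sample and the true maximum, which is why the formula inflates M by roughly M/N.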

In the event, Allied statisticians reportedly estimated that the Germans had produced 246 tanks per month between June 1940 and September 1942. Intelligence estimates had put the figure at about 1,400 per month. When the Allies captured German production records after the war, they found that the factories had produced 245 tanks per month over that period, almost precisely what the statisticians had predicted, and less than 20 percent of the intelligence estimate.

(Thanks, Ryan.)

# Living Large

In his 1984 book Scaling, Duke University physiologist Knut Schmidt-Nielsen points out a pleasing coincidence:

A 30-gram mouse that breathes at a rate of 150 times per minute will breathe about 200 million times during its 3-year life; a 5-ton elephant that breathes at the rate of 6 times per minute will take approximately the same number of breaths during its 40-year lifespan. The heart of the mouse, ticking away at 600 beats per minute, will give the mouse some 800 million heartbeats in its lifetime. The elephant, with its heart beating 30 times per minute, is awarded the same number of heartbeats during its life.

In fact, most mammals have roughly the same number of heartbeats per lifetime, about $10^9$. Small mammals have high metabolic rates and short lives; large ones have low rates and long lives. Humans are lucky: “We live several times as long as our body size suggests we should.”
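
Schmidt-Nielsen’s figures are easy to verify with back-of-envelope arithmetic; the rates and lifespans below are taken from the quotation above:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def lifetime_total(rate_per_minute, years):
    """Total events (breaths, heartbeats) over a lifetime at a steady rate."""
    return rate_per_minute * years * MINUTES_PER_YEAR

mouse_beats = lifetime_total(600, 3)     # ~9.5e8
elephant_beats = lifetime_total(30, 40)  # ~6.3e8
```

Both totals land within a factor of two of $10^9$, despite a 200-fold difference in lifespan.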

# Current Affairs

Is it possible to sail on a river on a windless day? In Why Cats Land on Their Feet (2012), Mark Levi points out that the answer is yes, at least in principle. If the keel is turned broadside to the current, the moving water will carry the boat downstream, drawing the sail through the still air. Now the roles of the sail and the keel are reversed: The keel catches the motion of the river, acting as a sail, and the boat follows the course established by the sail, which acts as a keel. “It’s just like regular sailing,” Levi writes, “except upside down.”

That’s from the point of view of an observer on shore. In the boat’s reference frame, the water is still and a wind is blowing upstream. From this perspective the boat is sailing conventionally — the sail is catching the wind and the keel slices through the water.

“This is a neat symmetry,” Levi notes. “The sail and the keel exchange roles, depending on your reference frame! So the sail and the keel have completely equal rights in that respect.”

# Braess’ Paradox

The diagram above represents a road network, where T is the number of travelers on a given leg. The leg from START to A takes T/100 minutes to traverse, as does the leg from B to END. The legs from START to B and from A to END each take a constant 45 minutes.

Now suppose that 4000 drivers want to travel from START to END. The two routes (via A and via B) are equally efficient, so the drivers will split evenly between them, and each will arrive at END in 2000/100 + 45 = 65 minutes.

But now suppose that planners, hoping to improve matters, add a shortcut between A and B with a travel time of 0 minutes. Now all the drivers will take the route from START to A, since in the worst case it will take 4000/100 = 40 minutes, rather than the guaranteed 45 minutes taken by the leg from START to B. From A every driver will take the shortcut to B, for the same reason: Even in the worst case, the trip from B to END is 5 minutes faster than the trip from A to END.

As a result, every driver’s trip now takes 4000/100 + 4000/100 = 80 minutes, which is 15 minutes longer than in the original state of affairs. No individual driver has an incentive to change his behavior, since each of the two original routes (via A only or via B only) now takes 4000/100 + 45 = 85 minutes. If the 4000 drivers as a body could agree never to use the shortcut, they’d all be better off. But without a way to enforce this, all are stuck with longer commutes.
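
The arithmetic above can be laid out in a few lines; the leg names and costs follow the description in the text:

```python
DRIVERS = 4000

def congestible(t):
    """Travel time in minutes for a leg carrying t drivers (START->A, B->END)."""
    return t / 100

FIXED = 45  # minutes for the START->B and A->END legs

# Without the shortcut, drivers split evenly between the two routes.
before = congestible(DRIVERS // 2) + FIXED               # 65 minutes

# With a free A->B shortcut, everyone funnels through both congestible legs.
after = congestible(DRIVERS) + 0 + congestible(DRIVERS)  # 80 minutes
```

The equilibrium with the shortcut is worse for everyone, which is precisely the paradox: adding capacity to a network can degrade its performance.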

The principle was discovered by German mathematician Dietrich Braess in 1968. It’s known as Braess’ paradox.

# Math Notes

The American Mathematical Monthly of January 1959 notes an “interesting Pythagorean triangle” discovered by Victor Thébault: If the two perpendicular sides of a right triangle measure 88209 and 90288, then the hypotenuse is 126225.

In other words, if you sum the squares of 88209 and its reverse, the result is a perfect square.
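
This is quick to confirm with a one-off check (mine, not from the Monthly):

```python
import math

a = 88209
b = int(str(a)[::-1])          # 90288, the digits of a reversed
c = math.isqrt(a * a + b * b)  # integer square root of the sum of squares
assert c * c == a * a + b * b  # the sum is indeed a perfect square
print(c)                       # 126225
```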

# A Prime Number Generator

Take the first n prime numbers, 2, 3, 5, …, $p_n$, and divide them into two groups in any way whatever. Find the product of the numbers in each group, and call these products A and B. (If one of the groups is empty, assign it the product 1.) No matter how the numbers are grouped, $A+B$ and $\left |A-B \right |$ will always turn out to be prime numbers, provided only that they’re less than $p_{n+1}^{2}$ (and greater than 1, of course). For example, here’s what we get for (2, 3, 5), where $p_{n+1}^{2} = 7^2 = 49$:

2 × 3 + 5 = 11
2 × 5 + 3 = 13
2 × 5 – 3 = 7
3 × 5 + 2 = 17
3 × 5 – 2 = 13
2 × 3 × 5 + 1 = 31
2 × 3 × 5 – 1 = 29

In More Mathematical Morsels (1991), Ross Honsberger writes, “For me, the fascination with this procedure seems to lie to a considerable extent in the amusement of watching it actually turn out prime numbers; I’m sure I only half believed it would work until I had seen it performed a few times.”

It makes sense if you think about it. Each of the first n prime numbers divides either A or B but not the other, so it cannot divide $A+B$ or $\left |A-B \right |$. That means that any prime divisor of $A+B$ or $\left |A-B \right |$ must be at least $p_{n+1}$, and if there were more than one of them (counted with multiplicity), the number would amount to at least $p_{n+1}^{2}$, putting it outside the limit. So if $A+B$ or $\left |A-B \right |$ lies between 1 and $p_{n+1}^{2}$, it must itself be a prime $p$ with $p_{n+1} \leq p < p_{n+1}^{2}$.
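
A brute-force check over every split of the first three primes confirms the claim (a sketch; the function names are my own):

```python
from itertools import combinations

def is_prime(k):
    """Trial division, sufficient for these small values."""
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

def generated_values(primes, next_prime):
    """Collect A+B and |A-B| over every split of `primes` into two groups,
    keeping only values strictly between 1 and next_prime**2."""
    limit = next_prime ** 2
    out = set()
    for r in range(len(primes) + 1):
        for group in combinations(primes, r):
            a = b = 1
            for p in primes:
                if p in group:
                    a *= p
                else:
                    b *= p
            for v in (a + b, abs(a - b)):
                if 1 < v < limit:
                    out.add(v)
    return out

values = generated_values([2, 3, 5], 7)
assert all(is_prime(v) for v in values)
```

For (2, 3, 5) the procedure yields exactly the primes {7, 11, 13, 17, 29, 31} listed above.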

# Misc

• When written in all caps, the title of John Hiatt’s song “Have a Little Faith in Me” contains no curves.
• Tycho Brahe kept a tame elk.
• It isn’t known whether the sum of π and e is irrational.
• Abraham Lincoln, Andrew Johnson, Ulysses Grant, and James Garfield died without wills.
• “Selfishness is one of the qualities apt to inspire love.” — Nathaniel Hawthorne

The medieval Latin riddle In girum imus nocte et consumimur igni (“We enter the circle at night and are consumed by fire”) is a palindrome. The answer is “moths.”

# The Revelation Game

Is it rational to believe in the existence of a superior being? In 1982, New York University political scientist Steven J. Brams addressed the question using game theory. Assume that SB (the superior being) chooses whether to reveal himself, and P (a person) chooses whether to believe in SB’s existence. The two players have the following goals:

SB: Primary goal — wants P to believe in his existence. Secondary goal — prefers not to reveal himself.
P: Primary goal — wants belief (or nonbelief) in SB’s existence confirmed by evidence (or lack thereof). Secondary goal — prefers to believe in SB’s existence.

These goals determine the rankings of the four possible outcomes:

|                           | P believes | P doesn’t believe |
|---------------------------|------------|-------------------|
| SB reveals himself        | (3, 4)     | (1, 1)            |
| SB doesn’t reveal himself | (4, 2)     | (2, 3)            |

In each ordered pair, the first number refers to SB’s preference for that outcome (4 is high, 1 is low), and the second number refers to P’s preference. For example, SB prefers the two outcomes in which P believes in SB’s existence (because that’s his primary goal), and of these two outcomes, he prefers the one in which he doesn’t reveal himself (because that’s his secondary goal).

Brams finds a paradox here. If the game is one of complete information, then P knows that SB prefers not to reveal himself — that is, that SB prefers the second row to the first, regardless of P’s choice. And if SB will undoubtedly choose the second row, then P should choose his own preferred cell in that row, the second one. This makes (2, 3) the rational outcome of the game; it’s also the only outcome from which neither player would choose unilaterally to depart once it’s reached. And yet both players would prefer outcome (3, 4) to (2, 3).
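
The equilibrium can be found by brute force over this 2×2 ordinal game. The payoff pairs below follow the rankings stated in the text; the move names are my own shorthand:

```python
# payoffs[(sb_move, p_move)] = (SB's rank, P's rank); 4 = best, 1 = worst.
payoffs = {
    ("reveal", "believe"): (3, 4),
    ("reveal", "doubt"):   (1, 1),
    ("hide",   "believe"): (4, 2),
    ("hide",   "doubt"):   (2, 3),
}

def nash_equilibria(payoffs):
    """Outcomes from which neither player gains by deviating alone."""
    eqs = []
    for (sb, p), (u_sb, u_p) in payoffs.items():
        alt_sb = "hide" if sb == "reveal" else "reveal"
        alt_p = "doubt" if p == "believe" else "believe"
        if payoffs[(alt_sb, p)][0] <= u_sb and payoffs[(sb, alt_p)][1] <= u_p:
            eqs.append((sb, p))
    return eqs

print(nash_equilibria(payoffs))  # [('hide', 'doubt')] -- the (2, 3) outcome
```

The search confirms that (2, 3) is the unique Nash equilibrium, even though both players rank (3, 4) above it.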

“Thus,” writes Brams, “not only is it rational for SB not to reveal himself and for P not to believe in his existence — a problem in itself for a theist if SB is God — but, more problematic for the rationalist, this outcome is unmistakably worse for both players than revelation by SB and belief by P, which would confirm P’s belief in SB’s existence.”

(Steven J. Brams, Superior Beings, 1983. This example is drawn largely from his paper “Belief in God: A Game-Theoretic Paradox,” in International Journal for Philosophy of Religion 13:3 [1982], 121-129.)