By Basil E. Gala, Ph.D.

In Search of Meaning

Birth is Nature throwing dice. Being born male or female is like tossing a coin: heads, you get a penis and testicles; tails, you get a vagina, womb, and ovaries, and after puberty, breasts. When you marry, you take a big chance that the person you think you love will turn out to be unsuitable for the long term. Death can strike you unexpectedly in youth or after maturity. Yes, death itself is certain, but what follows death is still unknown to us mortals. We pray to God to save our souls in heaven. People commonly attribute to God important events which they cannot control because they’re random: God blessed us with a son, or God gave us abundant crops. Damage from earthquakes, hurricanes, and tornadoes is still called an “act of God,” not covered by routine insurance. Yet we have ways of dealing with uncertainty in rational ways to make the best of any situation, no matter how hazardous. These ways are the techniques of probability and statistical theory, coupled with game theory. That is, we deal with uncertainty by being rational, rather than crossing our fingers, avoiding black cats, walking around ladders rather than under them, and pocketing that rabbit’s foot.

Some events don’t suffer from uncertainty; they’re not random. If you drop a stone, it goes down to the ground. The earth’s gravitation field is the cause and the stone hitting the dirt is the effect: a cause and effect relationship or causality. If I punch you in the face, you’re likely to punch me back; that’s not exactly causality, but close to it. You may decide not to hit back for a variety of reasons; perhaps, you are chicken, you may be much weaker than I, or you may be a pacifist.

Many of the phenomena we live with are deterministic or involve cause-effect relationships. We usually like things this way because we know what to expect. The sun will rise tomorrow morning, regularly like clockwork. We can plan our actions better in a world of regularities because we can forecast the future more accurately. We may be able to exercise control over the situation and get the results we’re after.

If you farm, you use an almanac to plan your planting and harvesting. The weather, which is often not regular and predictable, may throw a monkey wrench in the works, but still you can expect a wet winter and spring, a warm summer for the crops to mature and be ready for harvesting in the fall.

Democritus, a very early scientist, declared that the world is ruled by necessity and by chance. The ancient Greek gods had to obey necessity (Nature’s “laws”), but could escape chance with their superpowers. What percentage of events happen deterministically and what percentage randomly? We also have phenomena which are mixtures of deterministic and random effects, such as the punch in the face. I cannot give you percentages, so I’ll say fifty-fifty. Some people point to the regularities in the universe to show that a God exists who created this order. That’s the watchmaker argument. What does the chaos in the universe prove? Perhaps that Satan has also been active.

Naturally, we avoid chaos and Satan. Sailors prayed for winds to their destination and farmers for sufficient rainfall, but not too much of it. In societies with dominant males, parents pray for a boy to be born to them. Most of us long for a long and healthy life, but the end is always uncertain. The philosopher Solon told wealthy and happy King Croesus: “Midena pro tou telous makarize.” Do not praise anyone’s good fortune before the end. Croesus lost his wealth and kingdom before his end.

The end of the universe, some scientists forecast, will be complete chaos, an entropy death, when disorder is total, since the second law of thermodynamics shows that all systems tend towards greater disorder. Well, the law applies to closed systems, without inputs or outputs, which may not be the case for the universe.

Can we presume then that the universe was orderly when it began from a singularity? If the universe in the beginning was as small as a particle, such as an atom, was it not then subject to quantum fluctuations? These fluctuations may have given rise to regularities causing the universe to evolve into complex shapes and to generate living things, including intelligent humans. A lowering of disorder and entropy therefore does occur in the universe at least at times and in some regions. How do random particles generate large organized systems, such as our bodies, which in time fall apart into particles again?

Scientists have shown conclusively that particles, from photons to molecules, behave in random ways. The position, or existence, of a particle is described by a probability wave; we cannot pinpoint the particle’s location exactly. Since the atmosphere is a huge particle soup, the weather is quite unpredictable. Also largely unpredictable is a national economy of human monads exchanging products and services. Humans are not particles, but they often behave in random ways as particles do. You can never be sure exactly how a human, or any animal, will react.

Still, you can expect, with a very high probability, a human or animal to act in character. Character is the entire collection of an individual’s habits and instincts. When you are well informed about a person’s character, you can predict the response 95% of the time. As General Patton (George C. Scott) said in the film Patton: “Rommel, you magnificent bastard; I read your book.”

To the extent of established habits and instincts, a person’s acts are predetermined and free will is lacking.

Yet what we desire and will strongly tends to become fact, even though, in a changing world with much randomness, getting things done is challenging. As I pointed out at the start, we can gain control and effect desired outcomes by using probability theory and statistics.

Until the nineteenth century probability theory was terra incognita even for the best minds. Much of the impetus for the development of the theory came from gambling. Also, the field received stimulation from the need to establish actuarial tables and develop fitting premiums for insurance, both life and casualty.

Death is a certain event for every person, just like taxes. What is random is the year of death. By collecting large amounts of data about many deaths, average ages of death can be estimated. Carl Friedrich Gauss, whose mathematics found use in such actuarial work, developed the Gaussian curve, also known as the normal or bell-shaped curve.

In fact, a random variable which is the sum of many independent variables, twelve or more as a rule of thumb, has an approximately Gaussian probability density. Such variables are the weight of persons or their intelligence. The emergence of a normal density from many summed factors is known in statistics as the central limit theorem.

The normal curve is well described by an exponential mathematical expression with two parameters: the mean of the density and its standard deviation, that is the square root of the average squared deviations of the observations from the mean. The Gaussian density is quite common in the sciences because many phenomena of interest to us are the result of many factors working together, as in the case of human intelligence.
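The curve just described can be written out explicitly. With mean μ and standard deviation σ, the normal density is:

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
```

These two parameters alone fix the whole curve: μ centers the bell and σ sets its width.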

What is of interest here is the question: Why do many factors at work result in a normal density, thus giving us a handle for dealing with randomness, for example in developing actuarial tables for insurance? Remember that the factors are random variables with scattering or fluctuations, but when brought together, the fluctuations cancel out and we end up with something predictable. It is so in a gas chamber, where the molecules do a random walk but the gas behaves predictably, and in the particles making up our own bodies, which is why our bodies have fixed positions instead of existing in many places at the same time, as quantum mechanics allows for single particles.
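The cancellation of fluctuations can be watched directly in a small simulation. The sketch below (plain Python; the function name and sample sizes are my own choices for illustration) adds up twelve independent uniform draws many times over and checks that the sums behave like a bell curve:

```python
import random
import statistics

def sum_of_uniforms(n_terms=12, n_samples=100_000, seed=1):
    """Sum n_terms independent Uniform(0, 1) draws, n_samples times.

    Each uniform draw has mean 0.5 and variance 1/12, so the sum of
    twelve has mean 6 and variance 1.
    """
    rng = random.Random(seed)
    return [sum(rng.random() for _ in range(n_terms))
            for _ in range(n_samples)]

sums = sum_of_uniforms()
mean = statistics.fmean(sums)
stdev = statistics.stdev(sums)
# For a normal density, about 68% of values fall within one
# standard deviation of the mean.
within_one_sd = sum(abs(s - mean) <= stdev for s in sums) / len(sums)
print(mean, stdev, within_one_sd)
```

Each individual draw scatters widely, yet the sums cluster tightly around 6 with the one-standard-deviation fraction near the normal curve’s 68%.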

What can you do with such results? Use the normal curve to place your bets in gambling or investments. Any one bet may be a loss or a gain, but in the long run you’ll get the average return. Casinos earn profits by betting on the average gain to the casino and the odds are fixed in the casino’s favor. A salesperson goes out to the field, seeing many prospects and making a pitch, knowing that after many presentations sales will follow.
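The casino’s arithmetic can be made concrete with a standard example, a single-number bet in European roulette, which pays 35 to 1 even though there are 37 pockets; the sketch below is an illustration of the principle, not any particular casino’s rules:

```python
from fractions import Fraction

# European roulette: 37 pockets, a straight-up bet pays 35 to 1.
p_win = Fraction(1, 37)
p_lose = Fraction(36, 37)
payout = 35      # units won per unit staked on a hit
stake_lost = 1   # unit lost on a miss

expected_value = p_win * payout - p_lose * stake_lost
print(expected_value)         # -1/37 per unit staked
print(float(expected_value))  # about -0.027
```

Any single bet wins big or loses the stake, but the average return per bet is -1/37, about -2.7%; over many thousands of bets the casino collects that margin almost surely.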

Predators know the law of averages instinctively. A cheetah, the world’s fastest land animal, will generally catch its prey about one time in ten chases. That one catch is sufficient to keep the cheetah fed.

In science and technology, any measurement you make will have a random error, usually normally distributed. Make many measurements of your object, then calculate the mean to get a number close to the correct one. It’s like tossing a coin: the more times you toss, the closer the fraction of heads comes to 0.5.
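Averaging away normally distributed error is easy to demonstrate. In the sketch below (plain Python; the “true value” 9.81 and the noise level are invented for illustration), a thousand noisy readings are averaged to recover something close to the quantity being measured:

```python
import random
import statistics

def average_measurement(true_value=9.81, noise_sd=0.05, n=1000, seed=3):
    """Simulate n readings of true_value corrupted by normally
    distributed error, then return their mean."""
    rng = random.Random(seed)
    readings = [true_value + rng.gauss(0.0, noise_sd) for _ in range(n)]
    return statistics.fmean(readings)

estimate = average_measurement()
print(estimate)  # close to 9.81, though single readings scatter by about 0.05
```

The scatter of the mean shrinks with the square root of the number of readings, which is why patient repetition beats one careful measurement.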

You’re now entering the mathematical theory of probability. The theory had its motivation in games of chance in the sixteenth century (Gerolamo Cardano) and in the seventeenth century (Blaise Pascal and Pierre de Fermat), with a formal presentation in the nineteenth century (Pierre-Simon Laplace). Andrey Kolmogorov established modern axiomatic probability theory with his 1933 “Foundations of the Theory of Probability.” In 1933, the year I was born, after centuries of geometry, algebra, and calculus, we at last had a rigorous mathematics of probability.

Coincidentally, the twentieth century was when physicists such as Niels Bohr and Richard Feynman gave us quantum mechanics, describing the random behavior of particles.

The axioms of probability are simple. Random variables, continuous or discrete, are defined on a sample space where each event is associated with a number between 0 and 1. The probability of an impossible event is 0 and that of a certain event is 1. We could equally well have assigned numbers from 0 to 100, which would better relate to percentages.

Another axiom is that the union of disjoint events has a probability which is the sum of probabilities of each event in the union.

How do we assign probabilities to events? We can do it in two ways. We can observe the frequencies of occurrence of each event, or we can estimate the probabilities using combinatorial analysis and certain assumptions, such as equally probable outcomes. For example, the probability that two ones will appear in a throw of two dice (snake eyes) is 1/36, close to 0.0278, assuming the dice are fair (not loaded) and well shaken in a cup.
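The snake-eyes figure comes straight out of combinatorial enumeration, which a few lines of Python can carry out, assuming as above that the dice are fair:

```python
from fractions import Fraction
from itertools import product

# All equally likely ordered outcomes of throwing two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
snake_eyes = [o for o in outcomes if o == (1, 1)]

p = Fraction(len(snake_eyes), len(outcomes))
print(p, float(p))  # 1/36, about 0.0278
```

Counting favorable outcomes over all equally likely outcomes is the same recipe that gives the odds for any dice or card game.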

Once probability theory and quantum mechanics appeared on the scene, Isaac Newton’s deterministic universe languished. Kismet was no longer fashionable. “The moving finger writes and having writ, moves on,” was relegated to poetry.

In 1927, Werner Heisenberg stated his uncertainty principle in physics. You cannot know both the position and the momentum of a particle exactly. That is because particles are also probabilistic waves.

When we attempt to predict outcomes to take fitting action for our benefit, we model the situation by simplifying the scene. Our usual models assume we have causal relationships in the events we observe. That is easier to handle but often incorrect. A degree of randomness always creeps into our field of view. For best results, take that randomness into account in your calculations.

Interesting games and contests are designed with randomness or unexpected events. Story tellers know this and include surprises in their plots. But you can estimate the probabilities or memorize frequency tables in games.

I supplemented my research fellowship as a graduate student by playing poker to win. I memorized the probabilities of various card hands. Moreover, I studied the behavior of other players. A bluff in poker is rarely successful but it introduces uncertainty in your competitors so that your strong hand will yield some profit when you get it.

A message with more uncertainty holds more information. Claude E. Shannon described this with his equation for the entropy of an information source in his 1948 “A Mathematical Theory of Communication.” Entropy is the weighted sum of the logarithms of the probabilities of the events, preceded by a negative sign. Because probabilities are numbers less than 1, their logarithms are negative, so each component of the sum is negative. The negative sign in front corrects for the negative components, so we end up with a positive quantity for information or entropy.
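In symbols, for a source whose events occur with probabilities p_i, Shannon’s entropy is:

```latex
H = -\sum_{i} p_i \log_2 p_i
```

With the logarithm taken to base 2, H is measured in bits.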

Shannon’s formula clearly shows that the more uncertainty we have in a message the greater the information we are receiving. Compare the entropy in tossing a coin versus throwing a pair of dice. A uniform distribution of events has maximum entropy or uncertainty.
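The coin-versus-dice comparison can be computed directly. In the sketch below (plain Python; the helper function is my own), the fair coin yields 1 bit while the 36 ordered outcomes of a pair of dice yield log2(36), about 5.17 bits, both being uniform distributions and therefore at maximum entropy for their number of outcomes:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]          # a fair coin: two equally likely faces
dice_pair = [1 / 36] * 36  # 36 equally likely ordered dice outcomes

print(entropy(coin))       # 1.0 bit
print(entropy(dice_pair))  # about 5.17 bits, i.e. log2(36)
```

Learning the result of the dice throw therefore tells you about five times as much as learning the result of the coin toss.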

In investments, we commonly say, higher risk corresponds to greater interest or gain. A very safe investment, such as a U.S. Treasury bill, currently yields a little over 2% in interest. Well, that is generally the case if one has no inside information or control. Does Warren Buffett invest in risky ventures? No. First, he and his team thoroughly investigate companies and appraise their value. Second, Buffett usually acquires sufficient shares in a company to exercise a measure of effective control over operations.

Earning money year after year in safe and thoroughly controlled investments is rather dull. Gambling in casinos is more exciting.

Games are exciting to the extent that outcomes are unpredictable. A contest stimulates the fans when the teams are evenly matched and the winner remains undetermined until the very end of the football or baseball game. People gamble on the outcome of the game, with insiders always having an edge, as they do in the stock market.

Some people seek change and adventure, pursuing their fortunes in foolish or calculated ventures.

Atoms and molecules that are free of each other and their medium perform random walks, bouncing against each other and excited by their own internal wave nature. Sometimes they form crystals, growing together and becoming firm and static. Crystals may become very hard, like diamonds. Usually, there is no life in crystals.

DNA, however, is a crystal in some respects, its molecules holding together to some extent in a helix. But it is also dynamic. DNA and RNA, built on carbon chains, provide the basic programming for all living things on earth. Accidental changes to the structure of DNA, usually harmful to life, occasionally provide a springboard for evolution. It is so with breakthrough inventions of all sorts. Serendipity should not be underestimated in a good life.

We can conclude that survival and success in life depend on flexibility and uncertainty as much as on firmness and structure.

December, 2016

Vista, California