Sunday, May 24, 2015

208: Your Kids Are Smarter Than You

Audio Link

Before we start, I'd like to thank listeners Don-e Merson and Seasoncolor, who have posted some more nice reviews on iTunes.   Thanks guys!

Now, on to today's topic.   Did you know that, measured by constant standards, the average Intelligence Quotient, or IQ, of the world's population has been steadily increasing for as long as it has been measured?  In fact, by today's standards, your great-grandparents most likely would be formally diagnosed as mentally retarded.  It's a little confusing, since the IQ tests are continually re-normalized, so the "average IQ" at any given time is pegged to 100.   But if we look at the raw test scores and compare them across decades, we see that in every modern industrialized country, the scores have slowly been creeping upwards.   This effect is known as the Flynn Effect, named after the New Zealand intelligence researcher who first noticed it in the 1980s.  This seems pretty surprising-- could our entire population really be steadily increasing its intelligence?

When I first heard about this effect, I was a bit skeptical.     If you've read Stephen Jay Gould's classic "The Mismeasure of Man", you have learned about all sorts of broken and ridiculous ways in which people have attempted to measure intelligence at various times.   My favorite example was an IQ test from the early 20th century where your intelligence was, in part, dependent on your ability to recall the locations of certain Ivy League colleges.    Even though such egregious examples are no longer likely to appear, you could easily hypothesize that the Flynn Effect was merely measuring the fact that over the past century, kids have been progressively exposed to a lot more miscellaneous trivia:  first through radio, then TV and growing mass media, and finally the Internet.

Even simple things such as the expanding access to books and magazines throughout the 20th century might have contributed; I remember all the hours I spent biking between local used bookstores as a teenager, looking for cool math and science books, and I doubt my father had such an opportunity at the same age.   My daughter won't even have to think about such absurdities, having instant access to virtually all major literature published by the human race over the Internet.   But it turns out that the belief that this IQ growth is just measuring access to accumulated factoids is not quite right-- the growth has been very minor in tests dependent on this type of factual knowledge, and is really measuring an increased ability to do abstract reasoning using simple concepts.

In our modern lives, we take the concept of abstraction for granted:  the ability to talk about and compare ideas, rather than just discuss concrete items and actions that are immediately relevant.   And of course all of modern mathematics, including topics we often discuss in this podcast, is dependent on the ability to do this kind of abstraction.   But this is not something to take for granted:  it has been slowly growing in our society from generation to generation.    For example, one of the online articles linked in the show notes talks about a study done on an isolated tribe in Liberia.   They took a bunch of random objects from the village and asked the villagers to sort them into categories.    Instead of sorting into groups of clothing, tools, and food, as we might do, they put items together that were used together, such as a potato with a knife, since the knife is used to cut the potato.     So apparently modern IQ tests are largely measuring our ability to think in abstract categories, and this is the ability that is increasing.   Flynn has argued that we should really label this kind of thinking as "more modern" rather than "more intelligent"-- can we really say objectively that one kind of thinking is better?   However, we probably can say that this modern thinking is a critical component in the explosion of science and technology that we observe in the modern world.

There are numerous theories to try to explain the Flynn Effect.   Most center on social and environmental factors.   Perhaps the explosion of media exposure is important not because of miscellaneous factoids, but because of the generally more cognitively complex environment, forcing us to think in abstractions to make sense of the massive bombardment of ideas coming at us from literature, television, and the Internet.   The growth of intellectually demanding work, where more and more of us have jobs that involve at least some thinking rather than pure manual labor, may also contribute.     Another possible factor is the reduced family size in the Western world:  with fewer kids around, each gets more parental attention, and this may foster development of abstract thought.    And of course, in recent years, I'm sure there has been an IQ explosion among the very important subset of the population who listen to Math Mutation.

Aside from social factors, there are more basic physical ones:    improvements to health and welfare, such as massively reduced malnutrition and disease, could also be important here.     You may remember that back in podcast 110, "One Intestinal Worm Per Child", we discussed how simple health measures can have a much bigger effect on educational success than fancy computers.   There is also the theory that we are simply measuring the effects of Darwinian natural selection, where parents with this more modern thinking style are more likely to reproduce, due to coping better in our technological 20th-21st century society.     But most biologists believe that the Flynn Effect has come upon us too quickly to be evolution-based.

To further complicate the discussion, some recent studies in Northern Europe seem to show that the Flynn Effect is disappearing or even reversing.   It's unclear whether this is a real effect, or an artifact of recent population shifts:  over the past two decades, there has been massive immigration from the Third World into these countries, and it could be that we are just measuring the fact that many new immigrants are at earlier stages of the Flynn Effect treadmill.   But as in every generation, there is no shortage of commentators who can find good reasons why today's young whippersnappers are supposedly getting dumber, such as a focus on repetitive video games and social-network inanity.   We need to contrast this with their parents' more intellectual pursuits, such as Looney Tunes and Jerry Springer.

So, what does this all mean?    We certainly do see some effects in society that may very well be partially due to the Flynn Effect, such as the explosion of new technology in recent years.    I think we should do whatever we can to continue making our kids smarter, and enabling more modern and abstract thinking-- though of course, that would be true with or without the Flynn Effect anyway.    Encourage your kids to engage in cognitively complex tasks such as reading lots of books, learning to play a musical instrument, and discussing cool math podcasts.   But when they tell you in a few years that you're going senile, don't take it personally:  due to the Flynn Effect, you really are dumber than they are.

And this has been your math mutation for today.

Sunday, April 12, 2015

207: Answering All Possible Questions

Audio Link

Have you ever wished, in your daily life, that you had a simple way to find all the answers about any subject that was vexing you?    Perhaps you are in a personal crisis wondering whether God exists, or maybe you have a mundane issue as simple as finding your way home when lost.   Well, according to 13th-century monk Ramon Llull, you're in luck.   Llull devised a unique philosophical system, based on combining a set of primitive concepts, that he believed would provide the path to solving any conceivable dilemma.   His primary goal was to find a way to discuss religious issues and rationally convert heathens to Christianity, without relying on unprovable statements from the Bible or other holy books.    As a philosophy, his system was far from definitive or complete, and gradually faded into obscurity.   But along the way he became a major contributor to mathematics, making advances in areas as diverse as algebra, combinatorics, and computer science as he tried to elaborate upon his strange philosophical methods.

Llull began by listing a set of nine attributes in each of several categories of thought, intended to represent a complete description of that category, which could be agreed upon both by Christians and non-Christians.   For example, his first list was the nine attributes of God:   goodness, greatness, eternity, power, wisdom, will, virtue, truth, and glory.    He wanted to discuss all combinations of these virtues, but repeating them endlessly was kind of tedious in the days before word processing, so he labeled each with a letter:  B, C, D, E, F, G, H, I, K.    He then drew a diagram in which he connected each letter to each of the others, forming kind of a nine-pointed star with fully connected vertices; you can see a picture at one of the links in the show notes.    By examining a particular connection, you could spur a discussion of the relationship of two attributes of God:  for example, by observing the connection between B and C, you could discuss how God's goodness is great, and how his greatness is good.    Whatever you might think of his religious views, this was actually a major advance in algebra:   while the basics of algebra had existed by then, variables were commonly represented by short words rather than letters, and had been thought of as simply representing an unknown to be solved for in a single equation.    For the first time, Llull was using letters to represent something more complex than numbers, and mixing and matching them in arbitrary expressions.    In addition, his diagram of the relations between attributes was what we now call a graph, an important basic data structure in computer science.   He also created another depiction of the possible combinations as a square half-matrix, another data structure that is common today but was unknown in Llull's time.
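
To make this concrete, here is a minimal Python sketch of Llull's first figure, treating it as what a modern programmer would: the complete graph on his nine lettered attributes.   The attribute names and letters come from the episode; everything else is my own illustration.

    from itertools import combinations

    # Llull's nine attributes of God, with his letter labels B through K.
    attributes = {
        "B": "goodness", "C": "greatness", "D": "eternity",
        "E": "power",    "F": "wisdom",    "G": "will",
        "H": "virtue",   "I": "truth",     "K": "glory",
    }

    # His fully connected nine-pointed figure is the complete graph K9:
    # every unordered pair of letters is an edge, and every edge is a
    # topic of discussion.
    edges = list(combinations(sorted(attributes), 2))
    print(len(edges))        # 36 pairs, i.e. "9 choose 2"
    for a, b in edges[:3]:   # e.g. B-C pairs goodness with greatness
        print(a, b, "->", attributes[a], "/", attributes[b])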

Llull's system got even more complicated when he introduced additional sets of attributes, and tried to find more combinations.     For example, another set of his concepts consisted of relationships:  difference, concordance, contrariety, beginning, middle, end, majority, equality, minority.   He also had a list of subjects:   God, angel, heaven, man, imaginative, sensitive, vegetative, elementative, instrumentative.   Even deeper philosophical conversations could theoretically result from combining elements from several lists.   This created some challenges, however.   He would again label each element of these lists with letters, but keeping track of all combinations led to an explosion of possibilities:  just the three lists we have so far make 9x9x9, or 729 combinations, and he had a total of 6 major lists.   So to facilitate discussion of arbitrary combinations, he created a set of three nested wheels, each divided into 9 sectors, one for each letter.   One would be drawn on a sheet of paper, and the other two would be progressively smaller and drawn on separate sheets that could be placed over the first one and independently rotated.    Thus, he had developed a kind of primitive machine for elaborating the combinations of multiple sets:  for each 9 turns of one wheel, you would turn the next larger wheel once, and by the time you returned to your starting point, you would have explored all the combinations possible on the three wheels.    Several centuries later, the great mathematician Gottfried Leibniz cited Llull as a major influence when inventing the first mechanical calculating machines.
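
Llull's stack of rotating wheels is exactly what we would now call iterating over a Cartesian product.   Here is a short sketch-- again just my own illustration, with the list contents taken from the episode-- that spins through all 729 combinations of three of his lists the same way his paper wheels did:

    from itertools import islice, product

    virtues   = ["goodness", "greatness", "eternity", "power", "wisdom",
                 "will", "virtue", "truth", "glory"]
    relations = ["difference", "concordance", "contrariety", "beginning",
                 "middle", "end", "majority", "equality", "minority"]
    subjects  = ["God", "angel", "heaven", "man", "imaginative", "sensitive",
                 "vegetative", "elementative", "instrumentative"]

    # product() works like the nested wheels: the last wheel advances on
    # every step, and each full revolution clicks the next wheel forward once.
    wheels = product(virtues, relations, subjects)
    print(9 ** 3)                    # 729 combinations in total
    for combo in islice(wheels, 3):  # peek at the first few
        print(combo)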

There were also several other contributions resulting from this work, which you can read about in more detail at the links in the show notes:   Llull can be thought of as the first person to discuss ternary relations, or functions of more than one variable; and he anticipated some of Condorcet's contributions to election theory, which we discussed back in podcast 183.  Llull, of course, was not really concerned with making contributions to mathematics, as he was concentrating on developing a comprehensive philosophical system.   In his own mind, at least, he believed that he had succeeded:   he claimed that "everything that exists is implied, and there is nothing that exists outside it".   To help prove this point, he wrote a long treatise elaborating upon physical, conceptual, geometrical, cosmological, and social applications of his ideas.     Apparently he even spent five pages showing how his system could aid the captain of a ship that was lost at sea.    Personally, I would prefer to have a GPS.   But even if our modern thought processes don't strictly follow Llull's guidelines, we still owe him a debt of gratitude for his contributions to mathematics along the way.

And this has been your math mutation for today.

Sunday, March 22, 2015

206: Deceptive Digits

Audio Link

Imagine that you are a crooked corporate manager, and are trying to convince your large financial firm's customers that they own a set of continually growing stocks, when in fact you blew the whole thing investing in math podcasts over a decade ago. You carefully create artificial monthly statements indicating made-up balances and profits, choosing numbers where each digit 1-9 appears as the leading digit about 1/9th of the time, so everything looks random just like real balances would. You are then shocked when the cops come and arrest you, telling you that the distribution of these leading digits is a key piece of evidence. In fact, due to a bizarre but accurate mathematical rule known as Benford's Law, the first digit should have been 1 about 30% of the time, with probabilities trailing off until 9s only appear about 5% of the time. How could this be? Could the random processes of reality actually favor some digits over others?

This surprising mathematical law was first discovered by American astronomer Simon Newcomb back in 1881, in a pre-automation era when performing advanced computations efficiently required a small book listing tables of logarithms. Newcomb noticed that in his logarithm book, the earlier pages, which covered numbers starting with 1, were much more worn than later ones. In 1938, physicist Frank Benford investigated this in more detail, which is why he got to put his name on the law. He looked at thousands of numbers from data sets as diverse as the surface areas of rivers, a large set of molecular weights, 104 physical constants, and all the numbers he could gather from an issue of Reader's Digest. He found the results remarkably consistent: a 1 would be the leading digit about 30% of the time, followed by 2 at about 18%, and gradually trailing down to about 5% each for 8 and 9.

While counterintuitive at first, Benford's Law actually makes a lot of sense if you look at a piece of logarithmic graph paper. You probably saw this kind of paper in high school physics class: it has a large interval between 1 and 2, with shrinking intervals as you get up to 9, and then the interval grows again to represent the beginning of the next order of magnitude. The idea is that this scale can represent values that may be very small and very large on the same graph, by having the same amount of space on a graph represent much larger intervals as the order of magnitude grows. It effectively transforms exponential intervals to linear ones. A data set that tends to vary evenly across orders of magnitude will generate numbers appearing at random locations on this log scale-- which means the probability of landing in the 1-2 interval is much larger than in the 2-3 interval, the 3-4 interval, and so on.
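
In fact, the log-paper picture translates directly into a formula: the fraction of each decade of the log scale occupied by leading digit d is log10(d+1) minus log10(d), or log10(1 + 1/d).   Here is a quick Python sketch of mine computing the resulting distribution:

    import math

    # A number's leading digit is d exactly when the fractional part of
    # its log10 lies in [log10(d), log10(d+1)); if those fractional parts
    # are spread uniformly, the chance of leading digit d is log10(1 + 1/d).
    for d in range(1, 10):
        print(d, f"{100 * math.log10(1 + 1 / d):.1f}%")
    # Prints about 30.1% for 1, 17.6% for 2, ... down to 4.6% for 9.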

Now, you are probably thinking of the next logical question: why would a data set vary smoothly across several orders of magnitude? Actually, there are some very natural ways this could happen. One way is if you are choosing a bunch of totally arbitrary numbers generated from diverse sources, as in the Reader's Digest example, or the set of assorted physical constants. Another simple explanation is exponential growth. Take a look, for example, at the powers of 2: 2, 4, 8, 16, 32, 64, 128, etc. You can see that at each count of digits, you only go through a few values before jumping to having more digits, or the next order of magnitude.  And whenever doubling adds a new digit, the result is a number that begins with a 1.   If you try writing out the first 20 or so powers of 2 and look at the first digits, you will see that we are already not too far off from Benford's Law, with 1s appearing most commonly in the lead.
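
You don't have to stop at 20 powers, either; here is a quick experiment of my own tallying the leading digits of the first thousand powers of 2:

    from collections import Counter

    # Tally the leading digits of 2^1 through 2^1000.
    counts = Counter(int(str(2 ** n)[0]) for n in range(1, 1001))
    for d in range(1, 10):
        print(d, f"{100 * counts[d] / 1000:.1f}%")
    # The tallies come out very close to Benford's
    # 30.1%, 17.6%, ..., 4.6% distribution.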

Sets of arbitrarily occurring human or natural data that can span multiple orders of magnitude also tend to share this Benford distribution. The key is that you need to choose a data set that does have this kind of span, due to encompassing both very small and very large examples. If you look at populations of towns in England, ranging from the tiniest hovel to London, you will see that it obeys Benford's law. However, if you define "small town" as a town with 100-999 residents, creating a category that is restricted to three-digit numbers only, this phenomenon will go away, and the leading digits will likely show a roughly equal distribution.

The most intriguing part of Benford's law is the fact that it leads to several powerful real-life applications. As we alluded to in the intro to this topic, Benford's Law is legally admissible in cases of accounting fraud, and can often be used to ensnare foolish fraudsters who haven't had the foresight to listen to Math Mutation. (Or who are listening too slowly and haven't reached this episode yet.) A link in the show notes goes to an article that demonstrates fraud in several bankrupt U.S. municipalities based on their reported data not conforming to Benford's law. It has been claimed that this law reveals fraud in Iran's 2009 election data as well, and in the economic data Greece used to enter the Eurozone. It has also been proposed that this could be a good test for detecting scientific fraud in published papers. Naturally, however, once someone knows about Benford's law they can use it when generating their fake data, so compliance with this law doesn't prove the absence of fraud.
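
As a toy version of such a screening test-- a rough sketch of the general idea, not the statistically rigorous methodology real forensic accountants use-- you could measure how far a data set's leading digits stray from Benford's distribution:

    import math
    from collections import Counter

    def benford_deviation(amounts):
        # Total absolute gap between the observed leading-digit frequencies
        # and Benford's predictions; scores near 0 look Benford-like.
        digits = [int(str(a).lstrip("0.")[0]) for a in amounts if a > 0]
        counts = Counter(digits)
        n = len(digits)
        return sum(abs(counts[d] / n - math.log10(1 + 1 / d))
                   for d in range(1, 10))

    # Fake balances with digits 1-9 equally likely in the lead score
    # around 0.54 -- far from the near-0 of genuine Benford-like data.
    fake = [100 * d + 7 for d in range(1, 10)] * 50
    print(f"{benford_deviation(fake):.2f}")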

So, next time you are looking at a large data set in an accounting table, scientific article, or newspaper story, take a close look at the first digits of all the numbers. If you don't see the digits appearing in the proportions identified by Benford, you may very well be seeing a set of made-up numbers.

And this has been your math mutation for today.

Sunday, March 1, 2015

205: The Converse of a CEO

Audio Link

Ever since I was a small child, I aspired to grow up to become a great Rectangle.    When I was only six years old, my father took me to meet one of the leading Rectangles of New Jersey, and I will always remember his advice:  "Be sure to have four sides and four angles."   All through my teenage years, I worked on developing my four sides and four angles, as I read similar advice in numerous glossy magazines aimed at Rectangle fans.     In high school, my guidance counselor showed me many nice pamphlets with profiles of famous Rectangles who had ridden their four sides and four angles to success.   Finally, soon after I turned 18, I took a shot at realizing my dream, lining up for many hours to audition for a spot on the popular TV show "American Rectangle".    But when I made it up onto the stage, I was mortified to be met by a chorus of laughter, ending up as one of the foolish dorks that Simon Cowell makes fun of on the failed-auditions episode.    With all my years of effort, I had not become a Rectangle, but a mere Trapezoid.

OK, that anecdote might be slightly absurd, but think for a moment about the premise.   Suppose you want to become successful in  some difficult profession or task.   A natural inclination is to find others who have succeeded at that, and ask them for advice.   If you find something that a large proportion of those successful people claim to have done, then you conclude that following those actions will lead you to success.     Most of us don't actually aspire to become geometric shapes, but you can probably think of many miscellaneous pieces of advice you have heard in this area:   practicing many hours, waking up early every day, choosing an appropriate college major, etc.    I started reflecting on this concept after looking at a nice career planning tool aimed at high school students, which lets them select professions they are interested in, and then read about attributes and advice from those successful in it.

Unfortunately, this kind of advice-seeking from the successful is actually acting out a basic mathematical fallacy.    In simple logic terms, an implication statement "A implies B" is logically different from its converse, "B implies A".   Neither statement logically follows from the other:   "A implies B" does not mean that "B implies A".   When we look at the case of rectangles, this seems fairly easy to understand:   the condition A of having four sides and four angles does NOT imply the consequent B, that the object is a rectangle.   By observing that all rectangles have these characteristics, we are learning the opposite:   being a rectangle implies that the object has four sides and four angles.   This is important to recognize because there may be infinitely many non-rectangle objects that meet this condition, and actual rectangles might represent only a small portion of the possibilities.     If we want to isolate conditions that imply something is a rectangle, we need to look at both rectangles and non-rectangles, to identify unique rectangle conditions, such as having four right angles.    Once we have a set of properties that pertain only to rectangles and not to non-rectangles, then we might be able to come up with an intelligent set of preconditions.
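
If you like, you can verify the fallacy mechanically by enumerating a truth table-- here is a tiny Python sketch of my own that makes the asymmetry plain:

    from itertools import product

    def implies(p, q):
        return (not p) or q  # standard material implication

    # Check every truth assignment of A and B.
    for a, b in product([False, True], repeat=2):
        print(a, b, implies(a, b), implies(b, a))
    # The (False, True) and (True, False) rows show that "A implies B"
    # and "B implies A" can disagree, so neither follows from the other.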

Sadly, real life does not always offer us geometric shapes.   When we substitute a real aspiration people might have, too many try to infer the keys to success just from looking at the successful.      Without thinking through this basic logical fallacy about a statement and its converse, that "A implies B" does not mean "B implies A",  many people waste lots of time and money following paths where their likelihood of success is minimal.     A common case among today's generation of middle class kids is the hopeful young writer who decides to major in English.   An aspiring writer might see that many successful writers have degrees in English, without taking the time to note that the proportion of English majors who become successful writers is infinitesimally small.    The statement "If you are a successful writer today, you probably have a college degree in English" does not imply "if you earn a degree in English, you will probably become a successful writer."       In contrast,  if looking at computer engineering, they might see a similar profile among the most successful-- but will also find that unlike in English, a huge majority of computer engineering majors do end up with a well-paying job in that field upon graduation.    So in that case, the implication really does work both ways-- but this is a coincidence, since the statement and its converse are independent.
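
To see how lopsided the two conditional probabilities can be, here is a sketch with entirely made-up numbers-- I am inventing every figure purely for illustration, not quoting any real statistics:

    # Hypothetical population, with all counts invented for illustration.
    writers = 1_000                  # successful writers
    english_majors = 4_000_000       # people holding English degrees
    writers_with_english = 700       # successful writers who majored in English

    p_english_given_writer = writers_with_english / writers
    p_writer_given_english = writers_with_english / english_majors
    print(f"P(English degree | successful writer) = {p_english_given_writer:.0%}")
    print(f"P(successful writer | English degree) = {p_writer_given_english:.4%}")
    # 70% versus roughly 0.02%: the statement and its converse can have
    # wildly different probabilities attached.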

Even famous business consultants are subject to this fallacy.   Have you heard of the influential 1980s business book "In Search of Excellence", where the authors closely looked at a set of successful companies to find out what characteristics they were built upon?      That became one of the all-time best-selling business books, and many leaders followed its sweeping conclusions, hoping to someday make their companies as successful as NCR, Wang, or Data General.     But some have criticized the basic premise of this research for this same basic flaw:  trying to determine the conditions of success by looking only at the successful will inherently get you the wrong kind of implication.   It may identify a set of preconditions that every successful company indeed had-- while those same preconditions are met by endless numbers of failed companies.   You really need to study both success and failure to find conditions that uniquely imply success.

So, when you or your children are thinking about the future, look carefully at all the available information, not just at instances of success.   Always keep in mind that a logical statement "A implies B" is truly distinct from its converse "B implies A", and take this into account in your decision making.
And this has been your math mutation for today.

Sunday, January 18, 2015

204: What Happened To Grigori Perelman?

Audio Link

Before we start, I'd like to thank listeners katenmkate and EdB, who recently posted nice reviews on iTunes. I'd also like to welcome our many new listeners-- from the hits on the Facebook page, I'm guessing a bunch of you out there just got new smartphones for Xmas and started listening to podcasts. Remember, posting good reviews on iTunes helps spread the word about Math Mutation, as well as motivating me to get to work on the next episode.

Anyway, on to today's topic. We often think of mathematical history as something that happened far in the past, rather than something that is still going on. This is understandable to some degree, as until you get to the most advanced level of college math classes, you generally are learning about discoveries and theorems proven centuries ago. But even since this podcast began in 2007, the mathematical world has not stood still. In particular, way back in episode 12, we discussed the strange case of Grigori Perelman, the Russian genius who had refused the Fields Medal, widely viewed as math's equivalent of the Nobel Prize. Perelman is still alive, and his saga has just continued to get more bizarre.

As you may recall, Grigori Perelman was the first person to solve one of the Clay Institute's celebrated "Millennium Problems", a set of major problems identified by leading mathematicians in the year 2000 as key challenges for the 21st century. Just two years later, Perelman posted a series of internet articles containing a proof of the Poincare Conjecture, a millennium problem involving the shapes of certain multidimensional spaces. But because he had posted it on the internet instead of in a refereed journal, there was some confusion about when or how he would qualify for the prize. And amid this controversy, a group of Chinese mathematicians published a journal article claiming they had completed the proof, apparently claiming credit for themselves for solving this problem. The confusion was compounded by the fact that so few mathematicians in the world could fully understand the proof to begin with. Apparently all this bickering left a bitter taste in Perelman's mouth, and even though he was selected to receive the Fields Medal, he refused it, quit professional mathematics altogether, and moved back to Russia to quietly live with his mother.

That was pretty much where things stood at the time we discussed Perelman in podcast 12. My curiosity about his fate was revived a few months ago when I read Masha Gessen's excellent biography of Perelman, "Perfect Rigor: A Genius and the Mathematical Breakthrough of the Century". It gives a great overview of Perelman's early life, where he became a superstar in Russian math competitions but still had to contend with Soviet anti-semitism when moving on to university level. It also continues a little beyond the events of 2006, describing a somewhat happy postscript: eventually the competing group of Chinese mathematicians retitled their paper "Hamilton–Perelman's Proof of the Poincaré Conjecture and the Geometrization Conjecture", explicitly removing any attempt to claim credit for the proof, and recasting their contribution as merely providing a more readable explanation of Perelman's proof. Sadly, this did not cause Perelman to rejoin the mathematical community: he has continued to live in poverty and seclusion with his mother, remaining retired from mathematics and refusing any kind of interviews with the media.

As you would expect, this reclusiveness just served to pique the curiosity of the world media, and there were many attempts to get him to give interviews or return to public life. Even when researching her biography, Masha Gessen was unable to get an interview. In 2010, the Clay Institute finally decided to officially award him the million dollar prize for solving the Poincare Conjecture. There had been some concern that his refusal to publish in a traditional journal would disqualify him for the prize, but the Institute seemed willing to modify the rules in this case. Still, Perelman refused to accept the prize or rejoin the mathematical community. He claimed that this was partially because he thought Richard Hamilton, another mathematician whose work he had built upon for the proof, was just as deserving as he was. He also said that "the main reason is my disagreement with the organized mathematical community. I don't like their decisions, I consider them unjust." Responding to a persistent reporter through the closed door of his apartment, he later clarified that he didn't want "to be on display like an animal in a zoo." Even more paradoxically, he added "I'm not a hero of mathematics. I'm not even that successful." Perhaps he just holds himself and everyone else to impossibly high standards.

Meanwhile, Perelman's elusiveness to the media has continued. In 2011 a Russian studio filmed a documentary about him, again without cooperation or participation from Perelman himself. A Russian journalist named Alexander Zabrovsky claimed later that year to have successfully interviewed Perelman and published a report, but experienced analysts, including biographer Masha Gessen, poked that report full of holes, pointing out various unlikely statements and contradictions. One critic provided the amusing summary "All those thoughts about nanotechnologies and the ideas of filling hollowness look like rabbi's thoughts about pork flavor properties." A more believable 2012 article by journalist Brett Forrest describes a brief, and rather unenlightening, conversation he was able to have with Perelman after staking out his apartment for several days and finally catching him while the mathematician and his mother were out for a walk.

Probably the most intriguing possibility here is that Perelman has not actually abandoned mathematics, but has merely abandoned the organized research community, and is using his seclusion to quietly work on the problems that truly interest him. Fellow mathematician Yakov Eliashberg claimed in 2007 that Perelman had privately confided that he was working on some new problems, but did not yet have any results worth reporting. Meanwhile, Perelman continues to ignore the world around him, as he and his mother quietly live in their small apartment in St Petersburg, Russia. Something tells me that this not quite the end of the Perelman story, or of his contributions to mathematics.

And this has been your math mutation for today.

Saturday, December 27, 2014

203: Big Numbers Upside Down

Audio Link

When it comes to understanding big numbers, our universe just isn't very cooperative.  Of course, this statement depends a bit on your definition of the word "big".   The age of the universe is a barely noticeable 14 billion years, or 1.4 times 10 to the 10th power.   The radius of the observable universe is estimated as 46 billion light years, around 4.4 times 10 to the 26th power meters.  The observable universe is estimated to contain a number of atoms equal to about 10 to the 80th power, or a 1 followed by 80 zeroes.   Now you might say that some of these numbers are pretty big, by your judgment.   But still, these seem pretty pathetic to me, with none of their exponents even containing exponents.   It's fairly easy to write down a number that's larger than any of these without much effort, and we have discussed such numbers in several previous podcasts.  While it's easy to come up with mathematical definitions of numbers much larger than these, is there some way we can relate even larger numbers to physical realities?   Internet author Robert Munafo has a great web page up, linked in the show notes, with all kinds of examples of significant large numbers.
   
There are some borderline examples of large numbers that result from various forms of games and amusements.   For example, the number of possible chess games is estimated as 10 to the 10 to the 50th power.   Similarly, if playing the "four 4s" game on a calculator, trying to get the largest number you can with four 4s, you can reach 4 to the 4 to the 4 to the 4th power, which works out to about 10 to the (8 times 10 to the 153rd power).  It can be argued, however, that numbers that result from games, artificial exercises created by humans for their amusement, really should not count as physical numbers.   These might more accurately be considered another form of mathematical construct.
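
By the way, that four-4s figure is easy to double-check with logarithms, since the tower itself has far too many digits to ever print out.   Here is a quick sketch of mine:

    import math

    # 4^(4^(4^4)) is unprintably huge, but log10(4^x) = x * log10(4)
    # reduces the top of the tower to a single multiplication.
    x = 4 ** (4 ** 4)                   # 4^256: exact integer, about 1.34e154
    log10_of_tower = x * math.log10(4)  # log10 of the full 4^4^4^4 tower
    print(f"{log10_of_tower:.3e}")      # about 8.072e153, i.e. 10^(8 * 10^153)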
   
At a more physical level, some scientists have come up with some pretty wild sounding numbers based on assumptions about what goes on in the multiverse, beyond what humans could directly observe, even in theory.   These are extremely speculative, of course, and largely border on science fiction, though based at some level in modern physics.  For example, one estimate is that there are likely 10 to the 10 to the 82nd power universes existing in our multiverse, though this calculation varies widely depending on initial assumptions.   In an even stranger calculation, physicist Max Tegmark has estimated that if the universe is infinite and random, then there is likely another identical copy of our observable universe within 10 to the 10 to the 115th power meters.   Munafo's page contains many more examples of such estimates from physics.
   
My favorite class of these large "physical" numbers is the use of probabilities, as discussed by Richard Crandall in his classic 1997 Scientific American article (linked in the show notes).   There are many things that can physically happen whose infinitesimal odds dwarf the numbers involved in any physical measurement we can make of the universe.   Naturally, due to their infinitesimal probabilities, these things are almost certain never to actually happen, so some might argue that they are just as theoretical as artificial mathematical constructions.  But I still find them a bit more satisfying.  For example, a parrot placed in front of a typewriter for a year would have odds of about 1 in 10 to the 3 millionth of pecking out a classic Sherlock Holmes novel.   Taking on an even more unlikely event, what is the probability that a full beer can on a flat, motionless table will suddenly flip on its side due to random quantum fluctuations sometime in the next year?  Crandall estimates this as 1 in 10 to the 10 to the 33rd.   In the same neighborhood is the chance of a mouse surviving a week on the surface of the sun, due to random fluctuations that locally create a comfortable temperature and atmosphere:  1 in 10 to the 10 to the 42nd power.  Similarly, your odds of suddenly being randomly and spontaneously teleported to Mars are 10 to the 10 to the 51st power to 1.   Sorry, Edgar Rice Burroughs.
   
So, it looks like tiny probabilities might be the best way to envision the vastness of truly large numbers, and escape from the limitations of our universe's puny 10 to the 80th power number of atoms.  If you aren't spontaneously teleported to Mars, maybe you can think of even more cool examples of large numbers involved in tiny probabilities that apply to our physical world.
   
And this has been your Math Mutation for today.

Sunday, November 23, 2014

202: Psychochronometry

Audio Link

Before we start, I'd like to thank listener Stefan Novak, who made a donation to Operation Gratitude in honor of Math Mutation.  Remember, you can get your name mentioned too, by donating to your favorite charity and sending me an email about it!

Now, on to today's topic.  I recently celebrated my 45th birthday.  It seems like the years are zipping by now-- it feels like just yesterday when I was learning to podcast, and my 3rd grader was the baby in the cover photo.   This actually ties in well with the fact that I've recently been reading "Thinking in Numbers", the latest book by Daniel Tammet.   You may recall that Tammet, who I've featured in several previous episodes, is known as the "Rosetta Stone" of autistic savants, as he combines Rain Man-like mathematical talents with the social skills to live a relatively normal life, and write accessible popular books on how his mind works.    This latest book is actually a collection of loosely autobiographical essays about various mathematical topics.   One I found especially interesting was the discussion of how our perceptions of time change as we age.
     
I think most of us believe that when we were young, time just seemed longer.   The 365 days between one birthday and the next were an inconceivably vast stretch of time when you were 9 or 10, while at the age of 45, it does not seem nearly as long.   Tammet points out that there is a pretty simple way to explain this using mathematics:  when you are younger, any given amount of time simply represents a much larger proportion of your life.   When you are 10, the next year you experience is equal to 10% of your previous life, which is a pretty large chunk.   At my age, the next year will only be 1/45th of my life,  or about 2.2%, which is much less noticeable.   So it stands to reason that as we get older, each year will prove less and less significant.   This observation did not actually originate with Tammet-- it was first pointed out by 19th century philosopher Paul Janet, a professor at the Sorbonne in France.
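
Janet's proportionality is simple enough to tabulate in a couple of lines of Python-- a little sketch of my own just to make the numbers concrete:

    # Each upcoming year as a fraction of the life you have already lived.
    for age in [10, 20, 45, 90]:
        print(f"age {age}: next year is {100 / age:.1f}% of life so far")
    # age 10 -> 10.0%, age 45 -> 2.2%: the same calendar year keeps
    # shrinking as a share of your accumulated experience.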
   
Following up on the topic, I found a nice article online by an author named James Kenney, which I have linked in the show notes.  He mentions that there is a term for this analysis of why time seems to pass by at different rates, "Psychochronometry".   Extending the concept of time being experienced proportionally, he points out that we should think of years like a musical scale:  in music, every time we move up one octave in pitch, we are doubling the frequency.   Similarly, we should think of our lives as divided into "octaves", with each octave being perceived as roughly the same subjective length as the previous one.   So the times from ages 1 to 2, 2 to 4, 4 to 8, 8 to 16, 16 to 32, and 32 to 64 are each an octave, and the average human experiences each one as roughly equivalent.
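
If a year's subjective length really is proportional to 1 over your age, then the perceived span between ages a and b behaves like the logarithm of b divided by a, which is exactly why each doubling feels the same.   Here is a tiny sketch verifying that every octave gets the same score:

    import math

    # Summing 1/age year by year approximates the integral of 1/x,
    # so the subjective span from age a to b is roughly log(b / a).
    for a in [1, 2, 4, 8, 16, 32]:
        b = 2 * a
        print(f"ages {a:2d}-{b:2d}: {math.log(b / a):.3f} subjective units")
    # Every line prints the same value, log(2) = 0.693: each octave of
    # life feels about as long as the one before it.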
   
This outlook is a bit on the bleak side though:  it makes me uneasy to reflect on the fact that, barring any truly extraordinary medical advances in the next decade or two, I'm already well into the second-to-last octave of my life.  Am I really speeding down a highway to old age with my foot stuck on the accelerator, and time zipping by faster and faster?   Is there anything I can do to make it feel like I have more time left?   Fortunately, a little research on the web reveals that there are other theories of the passage of time, which offer a little more hope.
   
In particular, I like the "perceptual theory", the idea that our perception of time is in proportion to the amount of new things we have perceived during a time interval.  When you are a child, nearly everything is new, and you are constantly learning about the world.   As we reach adulthood, we tend to settle down and get into routines, and learning or experiencing something truly new becomes increasingly rare.   Under this theory, the lack of new experiences is what makes time go by too quickly.  And this means there *is* something you can do about it-- if you feel like things are getting repetitive, try to arrange your life so that you continue to have new experiences.
   
There are many common ways to address this problem:  travel, change your job,  get married, have a child, or strive for the pinnacle of human achievement and start a podcast.  If time or money are short, there are also simple ways to add new experiences without major changes in your life.  My strong interest in imaginary and virtual worlds has been an endless source of mirth to my wife.  I attend a weekly Dungeons and Dragons game, avidly follow the Fables graphic novels, exercise by jogging through random cities in Wii Street U, and love exploring electronic realms within video games like Skyrim or Assassin's Creed.  You may argue that the unreality of these worlds makes them less of an "experience" than other things I could be doing-- but I think it's hard to dispute the fact that these do add moments to my life that are fundamentally different from my day-to-day routine.   One might argue that a better way to gain new experiences is to spend more time travelling and go to real places, but personally I would sacrifice 100 years of life if it meant I would never have to deal with airport security again, or have to spend 6 hours scrunched into an airplane seat designed for dwarven contortionists.
   
So, will my varied virtual experiences lengthen my perceived life, or am I ultimately doomed by Janet's math?   Find me in 50 years, and maybe I'll have a good answer.  Or maybe not-- time will be passing too quickly by then for me to pay attention to silly questions.
   
And this has been your math mutation for today.