“If the butterfly flaps its wings…” So goes the chaos theory of dynamical systems, where sensitive dependence on initial conditions determines the disorder of the system as a whole.
Consider, then, that we live in an extremely complex world governed by multivariate conditions; transposing each condition onto a theoretically based model would require zillions of terabytes to compute, which we do not yet have, and that creates a problem for such enterprises. Herein lies the premise of this undertaking, which only the brilliant mind of Edward Lorenz could have named the “Butterfly Effect.”
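A short Python sketch makes that sensitivity concrete, using Lorenz's own convection equations with his classic parameters (the time horizon and the size of the perturbation are chosen only for illustration):

```python
# A minimal sketch of sensitive dependence on initial conditions,
# using Lorenz's convection system with his classic parameters.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - y) - z, x * y - beta * z]

t_span = (0.0, 40.0)
t_eval = np.linspace(*t_span, 4000)

# Two starting points that differ by one part in a billion.
a = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0], t_eval=t_eval)
b = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0 + 1e-9], t_eval=t_eval)

# The tiny perturbation grows until the trajectories are unrelated.
separation = np.linalg.norm(a.y - b.y, axis=0)
print(f"separation at t=0:  {separation[0]:.1e}")
print(f"separation at t=40: {separation[-1]:.1e}")
```

Two trajectories launched a billionth apart end up, forty time units later, in entirely unrelated corners of the attractor.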
Each piece of the whole exerts influence on the whole in ways that we cannot completely comprehend. Our understanding continues to evolve, and the Eureka moments, encompassing as they may be, are also expositions of the limits of a dynamical system that we do not fully comprehend. Nothing is in isolation, and nothing changes without some effect from something else. For instance, the human population explosion over the past century is associated with consequences such as pressures for survival, with the corresponding extinction of some species and the artificial preservation of others for our own needs (horses and dogs, for instance). So to say we live in a dynamic system is an understatement of our own existence. It is a dynamic system that is constantly in flux: a plastic mold constantly being remolded.
Progress for humanity is littered with the refuse of failure. It seems to mimic life itself, where an estimated 1.5 billion species have gone extinct, based on fossil data. Our failures may not be as dramatic as that, but they are indicative of our limits.
Our failures stem from the limits of our understanding of the initial and subsequent conditions that are the prerequisites of a dynamic system, and of the perceived outcome. Therein lies the tail of this tale.
Mathematical Modeling:
To understand nature, humans have used the art of reductionism: reducing something to its most basic elements so that we can see how it ticks. This philosophy has given birth to multiple industries that have used nature as a template: aerospace, medical, engineering and nuclear, to name a few. Having mastered the art of seeing things flayed open, with all their pieces visible, has also inculcated in us the art of modeling, which helps newer designs predict function, performance and even the future.
Mathematical Modeling is a full-blown tool used in all sciences and disciplines. The question is: do we understand it?
Ah! There is the rub. Mathematical Modeling is a philosophical construct that has, with time, become the process du jour of scientific thought. It has been used successfully many times, and many times it has fallen short. As much as the former is true, it is the latter that gives us pause. From our failures comes the light of knowledge. This, then, is the fruit derived from dashed hopes, the fruit that advances the language of mathematical perfection: something we seek but may never realize.
The fundamental question is whether it is possible to predict, with a high degree of reliability, the response of some process after various numerical inputs have been used to determine the outcome.
The first order here is the initial thought, also called “the concept.” The concept is based on several inputs that include materials, mechanics, interacting parts, procedures and so on. This minimalistic list is left to the “judgment and experience of the mathematical analysts.” Any increase in the number of variables, such as interacting and changing external dynamic factors, requires a complex series of undertakings; this is where probability and mathematics step in.
So mathematical models vary by the desired study, the variables involved, the materials used, and the intensity of the computational analysis. For instance, a model based on the aeronautical dynamics of a composite material will require information on the deformation patterns, tensile strength and elasticity of the material under study.
All such comparisons would be based on aluminum (the gold standard), which has been in use for decades. Additional information regarding its tolerance to fatigue (recall the Aloha Airlines disaster) and its resistance to fire (the use of Nomex fibers in composites, as this relates to the strength or weakness of the material) will be brought to bear on the new concept to be exploited.
Carbon fiber Composite material
After the initial conceptualization process has been completed, the concept has to be validated by the rigor of predicting component, subcomponent and/or system failures. This is done via simulated experiments. At this point all subcomponents, components, subassemblies and the total assembly of the product under consideration are forced through known “stressors.” A successful prediction, based on inducing failures under different external variables, gives validation to the concept. The errors that creep in can relate to the initial conceptualized model, the numerical approximation (recall Edward Lorenz's error), the simulated experiment, or the statistical variables employed. Most of the modeling is based on predictable axioms, but they are axioms nevertheless, and as such they suffer the wrath of nature's whims.
Remembering that a mathematical model is an order of reference within the hierarchy of a larger model must give the analyst pause and concern at every level. For instance, an analyst performing a validation experiment on a rigid machine exposed to large degrees of vibration has to have harmonic analysis in mind as one of the parameters. An example here is a rotor blade subjected to supersonic speeds: it undergoes deformation and linear and rotational stresses, and it places significant centrifugal forces on the rotor disc. Fatigue damage in a homogenous material is especially dangerous because it is unpredictable, giving no prior notification of imminent failure; it occurs suddenly and shows no exterior plastic deformation. Advanced composite materials, by contrast, exhibit gradual damage accumulation to failure. Typically, matrix cracking and delamination occur early in the rotor's life, while fiber fracture and fiber-matrix debonds initiate early and accumulate rapidly towards the end, leading to the final failure. Thus actual experimentation shows that composite rotor blades are better predictors of failure than homogenous ones; the mathematical modeling did not reveal the anomalous behavior of the different materials in the simulated outcome. In a calm wind environment the dynamics would differ from those in conditions of strong wind shear, where performance, function and integrity may be challenged. After several years of research and development, Boeing designed the 787 aircraft with 50% composite material for its light weight, improving efficiency and overall fuselage strength.
A mathematical model estimating population by using the “Birth Rate” and “Death Rate” does not really tell us the real story. Changing the parameters to “Per Capita Birth Rate” and “Per Capita Death Rate” scales them to the population being studied and thus gives a better record. Using the right parameter in the methodology makes a difference in the computed outcome and the desired result.
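A toy sketch of the distinction, with all numbers invented for illustration: raw birth and death counts describe a single year, while per-capita rates define a compounding growth law tied to the size of the population itself.

```python
# A toy contrast between raw birth/death counts and per-capita rates.
# All numbers are illustrative, not demographic data.
population = 1_000_000
births, deaths = 20_000, 12_000          # raw annual counts

b = births / population                   # per-capita birth rate
d = deaths / population                   # per-capita death rate
r = b - d                                 # per-capita growth rate

# Raw counts predict a fixed gain of 8,000 people every year;
# per-capita rates predict compounding growth proportional to size.
for year in range(1, 4):
    population *= (1 + r)
    print(f"year {year}: {population:,.0f}")
```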
History teaches us that the Malthusian model of population expansion versus resources was woefully inadequate, since there were no inputs for catastrophes. It was a steady-state model, which as we know is not how humans and nature behave, and neither did it incorporate externalities like the influenza epidemic of 1918, which infected roughly a third of the world's population and killed tens of millions. His modeling was simplistic: steady-state growth of the population outstripping the food supply, proclaiming the incipient crisis.
Adding a single variable can yield an unexpected result. Consider the effects of a spring-loaded pendulum or a double pendulum on the chaotic movement of the pendulum itself, as calculated in the Wolfram Mathematica program.
Spring loaded Pendulum
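A minimal sketch of the spring-loaded pendulum's equations of motion, integrated numerically (the parameters are illustrative; Mathematica users would reach for NDSolve, but the same system is shown here in Python):

```python
# A minimal sketch of the spring-loaded (elastic) pendulum: one added
# degree of freedom (a spring in place of a rigid rod) is enough to
# produce chaotic motion. All parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

g, m, k, L0 = 9.81, 1.0, 40.0, 1.0   # gravity, mass, spring constant, rest length

def spring_pendulum(t, s):
    r, r_dot, th, th_dot = s                  # radial length, angle from vertical
    r_ddot = r * th_dot**2 - (k / m) * (r - L0) + g * np.cos(th)
    th_ddot = -(2.0 * r_dot * th_dot + g * np.sin(th)) / r
    return [r_dot, r_ddot, th_dot, th_ddot]

sol = solve_ivp(spring_pendulum, (0, 30), [1.2, 0.0, 0.8, 0.0],
                t_eval=np.linspace(0, 30, 3000))
print(f"angle ranges over [{sol.y[2].min():.2f}, {sol.y[2].max():.2f}] rad")
```

The radial and angular motions exchange energy irregularly, which is exactly the unexpected behavior that the single added variable buys.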
So what is this mathematical modeling genie and how do we use it and benefit from it?
Mathematical Modeling streamlines some processes:
It conceptualizes a thought.
It places the thought through simulated rigor.
It subjects the thought to various externalities.
Modifications of the model can test different scenarios on the same hypothesis.
The problems that are inherent in Mathematical Modeling include:
The concept may not be proven.
The concept is insensitive to the variables.
It is too simple in its characteristics.
It is too complex, requiring unavailable computing power.
The model cannot lend itself to a mathematical solution.
Uninspiring as it may be, there are small-magnitude variables that are rarely incorporated in the validation process, and these include:
Symmetry, or if x = y then y = x, and
Transitivity, or if x = y and y = z then x = z.
Thus using only the reflexivity of x = x limits the useful load on the experiment.
Validation processes include content, context and criterion: content refers to the materials and information available, context refers to the externalities, and criterion is based on the specific environment of use. All three validations have to be undertaken in the simulated experimental model to achieve a high degree of confidence in its reliability. After all, the predictability of the validation process is based on repeatability. The Monte Carlo method of repeated computer simulations, used to draw inferences about the viability, repeatability and validation of the outcome of the experiment, is another mathematical model used in various scenarios.
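A minimal Monte Carlo sketch of such a repeated simulated experiment (the load and strength distributions are invented for illustration):

```python
# A minimal Monte Carlo sketch: repeat a simulated stress test with
# randomized inputs and estimate a failure probability from the spread.
# The load and strength distributions are invented for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)
n_trials = 100_000

# Hypothetical component: strength ~ N(500, 30) MPa, applied load ~ N(400, 40) MPa.
strength = rng.normal(500.0, 30.0, n_trials)
load = rng.normal(400.0, 40.0, n_trials)

failures = np.mean(load > strength)
print(f"estimated failure probability: {failures:.4f}")

# Repeatability check: a second independent run should agree closely.
strength2 = rng.normal(500.0, 30.0, n_trials)
load2 = rng.normal(400.0, 40.0, n_trials)
print(f"second run:                    {np.mean(load2 > strength2):.4f}")
```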
A Mathematical Model is, foundationally, a philosophical and logical construct. Philosophy is a nuance of understanding, and logic is based on the parametric values of the variables; both constructs are highly subject to perception, understanding, resource limits and bias. Thus the Logic of Failure can be a failure of logic.
Logic of Failure:
A model that did not consider the strange attractors:
The Challenger Disaster: On January 28, 1986, after a cold spell in Florida, the Challenger lifted off in a highly publicized event.
Unbeknownst to the occupants of the Space Shuttle cockpit, the O-rings, or “toric joints,” had become brittle from overnight exposure below their glass transition temperature of roughly 40 degrees Fahrenheit. They failed, and 73 seconds into flight the resulting expulsion of high-temperature gases onto the external fuel tank led to a catastrophic “System Anomaly!”
The risk analysis done prior to the Challenger flight, based on previous such space flights, put the chance of failure at 1 in 100,000; later analysis showed it was realistically closer to 1 in 100. The Challenger mission was the shuttle's tenth, and no previous failure data were available to assess the real risk, which on such a sparse record might more honestly have been put near 1 in 10. The O-ring issue had not been previously addressed, but it had been considered in the launch parameters, which were set at 50 degrees Fahrenheit. The overnight lower temperatures, however, were not factored into the equation.
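For perspective, one standard way to put a number on failure risk when no failures have yet been observed is Laplace's rule of succession; applied to the nine prior flights, it lands close to the 1-in-10 figure above:

```python
# One standard way to put a number on failure risk when no failures
# have yet been observed: Laplace's rule of succession. With n prior
# successes and zero failures, the estimated failure probability is
# 1 / (n + 2) -- nowhere near 1 in 100,000.
def laplace_failure_estimate(successes: int, failures: int = 0) -> float:
    return (failures + 1) / (successes + failures + 2)

# Nine prior Challenger flights, no losses:
print(f"{laplace_failure_estimate(9):.3f}")   # ~0.091, about 1 in 11
```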
The noted physicist Richard Feynman made the dramatic announcement of the etiology of the failure before a nation in mourning.
Richard Feynman (Nobel Laureate)
H1N1 Epidemic: In the spring of 2009 the CDC (Centers for Disease Control and Prevention) made a bold and provocative declaration that the estimated number of H1N1 cases in the country was “upwards of 100,000.”
At that point in time only 7,415 cases had been confirmed. These lofty projections were based on mathematical modeling done by two supercomputers. Both computers, fed similar data, drew the same conclusion; the motto “rubbish in, rubbish out” is self-explanatory. The model projected an exponential rise in cases based on comparisons previously drawn with the influenza epidemic of 1918. The World Health Organization predicted the dire outcome of a worldwide pandemic and issued orders for the quarantine of suspected individuals. Fortunately that proved to be untrue, but in the process the world panicked. Conjecture and axioms have a way of loading up and ganging up on unsuspecting scientists and laypeople alike. Airline traffic declined, hotels lost bookings, and conferences were cancelled, with unanticipated, huge economic losses. Those forced to travel wore unwieldy masks to protect themselves, furthering the fear. This was not a case of “red face” or “black eye.” It was a debacle promulgated by a scientific folly.
H1N1 virus
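The mechanics of such a projection are simple, which is part of the danger. In the sketch below, the weekly growth factor and the horizon are invented for illustration; only the confirmed-case count comes from the account above:

```python
# A sketch of how an exponential-growth assumption inflates a case count.
# The growth rate and horizon are invented for illustration only.
confirmed = 7_415          # confirmed cases at the time of the projection
weekly_growth = 1.5        # assume cases multiply 1.5x per week
weeks = 7

projected = confirmed * weekly_growth ** weeks
print(f"projected cases after {weeks} weeks: {projected:,.0f}")
# ~126,700 -- "upwards of 100,000" from under 7,500 confirmed cases,
# conditional entirely on the assumed growth rate holding.
```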
HIV Progression: Two high-profile science papers related to HIV and AIDS were published in the prestigious journal Nature in 1995. The media buzz of a coming Armageddon predicted large-scale population decimation as a result of this infectious virus; human existence was at stake. The premise was to treat all patients infected with HIV, even if they were asymptomatic. The two studies, by Drs. David Ho and George Shaw, produced what was called the Ho/Shaw predictive model, which used the viral load and the quantitative value of the CD4 T cells (immune cells) as the parameters for its therapeutic assertions. The rise of the viral load was one parameter and the fall of the CD4 cells was the other; there was a reciprocal relationship between the two, and thus the model predicted that, in order to prevent an asymptomatic carrier from progressing to full-blown AIDS, patients should be “hit hard and hit early” with a protease inhibitor. No control group was used in either of the studies done separately by the doctors. Large numbers of patients were treated using the model. Eventually, data gleaned from these treatments showed that the initial drop in the viral load was associated with a rise in CD4 cells, but over a prolonged period of close observation the reverse happened and the viral load increased. So there was no validation of the conceptual design, which had never been put through a properly designed scientific study with a control group to determine real efficacy. The unintended consequence of treating asymptomatic patients was that, in short order, it created multiple mutations in the virus, which became resistant to the protease inhibitor. Now those with the disease could not be helped. After many years of unnecessary treatment, this method was abandoned.
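To see how sparse a two-parameter model is, consider the following caricature of a reciprocally coupled viral load and CD4 count. This is emphatically not the actual Ho/Shaw mathematics; every rate is invented for illustration:

```python
# A minimal caricature of a two-parameter viral-dynamics model: viral
# load V and CD4 count T in a reciprocal relationship. This is NOT the
# actual Ho/Shaw mathematics; every rate here is invented for illustration.
from scipy.integrate import solve_ivp

def viral_dynamics(t, s, r, K=1e5, kill=0.002, supply=10.0, loss=0.01, harm=1e-4):
    V, T = s
    dV = r * V * (1 - V / K) - kill * V * T   # virus replicates; T cells clear it
    dT = supply - loss * T - harm * V * T     # T cells replenish; virus depletes them
    return [dV, dT]

# "Hit hard, hit early" is represented only as a lower replication rate r.
untreated = solve_ivp(viral_dynamics, (0, 300), [1000.0, 500.0], args=(0.9,))
treated = solve_ivp(viral_dynamics, (0, 300), [1000.0, 500.0], args=(0.3,))

print(f"untreated: V = {untreated.y[0, -1]:,.0f}, T = {untreated.y[1, -1]:,.0f}")
print(f"treated:   V = {treated.y[0, -1]:,.0f}, T = {treated.y[1, -1]:,.0f}")
# With only these two equations, suppression looks permanent: nothing in
# the model can represent the resistant mutants that later reversed it.
```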
The message in this modeling: it was based on only two parameters, without a control group. A preexisting opinion (bias) became a forced finding of fact. The model generated with the two parameters, in isolation from true knowledge of the various players of immunity, was at best self-delusional and at worst forced unnecessary therapy upon countless asymptomatic patients, with the resultant viral mutations showing resistance to the drug in use.
The Failure of Logic:
A model that considered strange attractors:
Economic Black Swan Effect: One can trace the history of the 2007-2008 market meltdown and the fall of Wall Street giants like Lehman Brothers, Bear Stearns and Washington Mutual to two Nobel Prize winners, Robert Merton and Myron Scholes, who with Fischer Black propounded the Black-Scholes option-pricing model. The model assumed that the prices of heavily traded assets follow a geometric Brownian motion with constant drift and limited volatility. For a stock option, the model incorporates the constant price variation of the stock, the time value of money, the option's strike price and the time to the option's expiry. The Black-Scholes option theory became the darling of Wall Street. It spawned derivatives, and the CBOE (Chicago Board Options Exchange) enjoyed a tremendous rise in contracts and therefore revenues: the Exchange traded 911 contracts on its first day in 1973, and by 2007 annual volume was approaching a billion contracts.
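For reference, the Black-Scholes call-price formula itself fits in a few lines; the inputs below are illustrative numbers, not market data:

```python
# The Black-Scholes call-price formula the essay describes: constant
# volatility, geometric Brownian motion, and the time value of money.
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Price of a European call: spot S, strike K, years to expiry T,
    risk-free rate r, and -- the fragile assumption -- constant volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative numbers: $100 stock, $105 strike, 6 months, 5% rate, 20% volatility.
print(f"call price: ${black_scholes_call(100, 105, 0.5, 0.05, 0.20):.2f}")
```

Everything in the formula is knowable in advance except sigma, and the assumption that sigma is constant is exactly where the model parts company with markets.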
Messrs. Scholes and Merton started the now infamous LTCM (Long-Term Capital Management) hedge fund. The fund was based on absolute-return trading strategies, including fixed-income arbitrage using convergence trades that exploited asymmetries among US, Japanese and European government bond prices, combined with high leverage. The latter strategy exposed the fund's management to a 25-to-1 debt-to-equity ratio in its later stages. The meteoric monetary gains of 40% in 12 months from this exploitation lasted until the East Asian financial crisis of 1997, followed by Salomon Brothers' withdrawal from arbitrage; the Russian financial crisis of 1998 then sealed the insolvency of the mathematically modeled LTCM hedge fund. Investors ran from Japanese and European bonds to US Treasuries for safety, and in so doing erased and reversed the asymmetries on which the LTCM managers' models depended, leading to huge losses in less than four months.
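The arithmetic of that leverage is unforgiving. A sketch with invented round figures (LTCM's actual balance sheet differed):

```python
# The arithmetic of 25-to-1 leverage, with invented round numbers:
# a small move against the position erases the equity entirely.
equity = 4.0                             # $4 billion of the fund's own capital
debt_to_equity = 25.0
assets = equity * (1 + debt_to_equity)   # ~$104 billion of positions

for decline in (0.01, 0.02, 0.04):       # portfolio losses of 1%, 2%, 4%
    loss = assets * decline
    print(f"{decline:.0%} decline -> ${loss:.1f}B loss, "
          f"remaining equity ${equity - loss:.1f}B")
# At a 4% move, remaining equity is about -$0.2B: insolvent.
```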
Keeping things in context, the Black-Scholes model was a simplified model that borrowed the physics of heat diffusion to describe changes in the price of shares. Gentle trending above and below the mean was its premise for stock movement, rather than the violent seesaw that happens in the real world. The model gave credence to slow movements of share prices in short-term trading strategies, rather than to the real-world events that can whipsaw and rubberneck a trending market. Computers running buying and selling software that can hit the market at record speed can create havoc with share prices, following programs created by quantitative financial analysts, or quants. The market is a living, breathing mechanism, driven by desires expressed singly by individuals or in large packets by computing devices, and governed ultimately by humans. That was not a parameter used in the Black-Scholes model.
In keeping with these economic modeling methods, the quants then used the derivatives market to develop bundled securities such as CDOs, or Collateralized Debt Obligations, and CDSs, or Credit Default Swaps.
The concept was simple: take a large number of mortgages of different risks, package them together and securitize them. The bundled mortgages would be rated by an agency that was paid by the CDO creators, and everyone would be content: the investor, comforted by the rating; the loan originator, by the diversity of the portfolio; and the seller, by the pricing power of the product. The mortgage payments were allocated to the triple-A investors first and then to the rest; the equity investor, in cases of default, was left holding a useless IOU. These large caches of securitized mortgages made the banks and lending agencies hustle for more lending. Individuals they would not have touched in prior years, due to financial risk, were now welcomed; these were also the people who could not afford to pay the mortgages and held no assets as collateral for the loans.
The rapid increase of CDOs
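A toy sketch of the payment waterfall just described; the tranche sizes and the default rate are invented for illustration:

```python
# A toy sketch of a CDO payment waterfall: tranche sizes and the
# default rate are invented for illustration.
def waterfall(collected, tranches):
    """Pay tranches in order of seniority until the cash runs out."""
    payouts = {}
    for name, owed in tranches:
        paid = min(owed, collected)
        payouts[name] = paid
        collected -= paid
    return payouts

pool = 100.0                      # $100M of expected mortgage payments
default_rate = 0.12               # 12% of borrowers stop paying
collected = pool * (1 - default_rate)

# Senior ("AAA") investors are paid first, the equity tranche last.
tranches = [("senior AAA", 70.0), ("mezzanine", 20.0), ("equity", 10.0)]
for name, paid in waterfall(collected, tranches).items():
    print(f"{name:>10}: ${paid:.1f}M")
# The 12% shortfall lands almost entirely on the equity tranche --
# until defaults exceed the junior layers and reach the "safe" paper.
```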
In their paper published in August 2010, The Failure of Models that Predict Failure: Distance, Incentives and Defaults, the authors Uday Rajan, Amit Seru and Vikrant Vig state:
“As the level of securitization increases, lenders have an incentive to originate loans that rate high based on characteristics that are reported to investors, even if other unreported variables imply a lower borrower quality…To illustrate this effect, we show that a statistical default model estimated in a low securitization period breaks down in a high securitization period in a systematic manner: it under-predicts defaults among borrowers for whom soft information is more valuable. Regulations that rely on such models to assess default risk may therefore be undermined by the actions of market participants.”
Bank Leverages
The banks and mortgage-lending agencies looking for a quick profit, and unaware individuals looking for a picket-fenced home, all participated in the “perfect storm.” The credit default market took advantage of desire and at one point was estimated at $55 trillion!
Using mathematical modeling has its benefits when dealing with an idea; however, in a complex world where the variables number in the millions if not billions, the chance of creating a model that satisfies all potential adversities is zero. In economics especially, real life and mathematical modeling are decoupled. Benoit Mandelbrot calculated that if the Dow Jones Average followed a normal distribution, it should have moved by more than 3.4% on 58 days between 1916 and 2003; in fact it did so 1,001 times. It should have moved by more than 4.5% on six days; in reality it did 366 times. It should have moved by more than 7% once in 300,000 years; in the 20th century it did so 48 times. At one point the market posted what the normal model would call 25-standard-deviation moves, several days in a row. The market sometimes moves in the extreme tail of its distribution curve: a perfect example of the tail wagging the dog!
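The flavor of Mandelbrot's comparison can be re-derived in a few lines. The assumptions below (roughly 250 trading days a year and a daily volatility near 1.1%) are illustrative stand-ins, not his exact inputs:

```python
# Re-deriving the flavor of Mandelbrot's comparison. Assumptions: ~88
# years of trading (1916-2003), ~250 trading days per year, and a daily
# volatility of about 1.1% -- illustrative, not his exact inputs.
from statistics import NormalDist

trading_days = 88 * 250
sigma = 0.011                                   # assumed daily standard deviation
observed = {0.034: 1001, 0.045: 366, 0.07: 48}  # counts cited in the text above

for move, actual in observed.items():
    # Two-sided probability of a daily move larger than `move` under normality.
    p = 2 * (1 - NormalDist(0, sigma).cdf(move))
    print(f">|{move:.1%}| expected {trading_days * p:8.2f} days, observed {actual}")
```

Whatever volatility one plugs in, the expected counts stay orders of magnitude below the observed ones, which is the whole indictment of the thin-tailed assumption.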
So are Mathematical Models cursed to eventually fail? The simple answer is no. They work well in the design details of components and assemblies, and when the environment is taken into account. However, when large-scale “real world” scenarios are attempted without relevant information, the asymmetries that remain unaccounted for in the modeling concept cannot be validated until the “real world” tests the product in its own environment.