Friday, December 13, 2019

Science, Causality & Genes


The inference of causality is rampant these days. Any whisper of correlation is now treated as a causal event; ask the epidemiologists. But causality is more akin to “if this, then that” (IFTTT). Using an oft-repeated term, a quid pro quo. A reaction to an action. Newtonian more than quantum. No fuzzy behavior as with the electron, but more of an antimatter-matter relationship. A good old deterministic... poof.



Defining causality is easy: a fact that delivers a verifiable consequence each time it is invoked. Flicking a switch turns the light on; that is causal. Pushing a button turns the motor on; again, causal. But as you might extrapolate from these examples, they are mechanical things. The mechanics and electrical circuitry are designed for a specific action, or IFTTT.
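The switch-and-light definition above can be sketched in a few lines of code. This is a toy illustration, not anything from the original post: the point is that the same trigger delivers the same verifiable consequence every single time it is invoked.

```python
# A minimal sketch of deterministic "if this, then that" causality:
# the same cause always produces the same verifiable effect.

def flick_switch(light_on: bool) -> bool:
    """Flicking the switch toggles the light -- every time, no exceptions."""
    return not light_on

light = False
light = flick_switch(light)   # cause: flick; effect: the light turns on
assert light is True
light = flick_switch(light)   # flick again: the light turns off
assert light is False
```

No probability, no hedging: the function is the circuit, and the circuit was designed for exactly this action.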



But now we haul off to the biological world and contemplate where the triggers for causal effects are. Where indeed? Perhaps the nearest example of such an event is a one-gene-one-disease relationship. Examples include cystic fibrosis, sickle cell disease, Fragile X syndrome, muscular dystrophy, and Huntington's disease.



The concept of gene therapy lies in replacing a faulty gene, or adding a new one, in an attempt to cure disease or improve the body's ability to fight disease.

These may be associated with dominant gene mutations, which are vertically transmitted from parent to child as in Huntington’s disease, or with a recessive gene, where the disease may skip generations. If you are interested in more, look here: https://www.nature.com/scitable/topicpage/rare-genetic-disorders-learning-about-genetic-disease-979/ and here for the genetic therapies utilized thus far: http://www.genetherapynet.com/JoomlaTest2/index.php?option=com_content&view=article&id=164:diseases-treated-with-gene-therapy-&catid=97:patient-information&Itemid=14 And yet more is to come with CRISPR technology, which hopes to revolutionize healthcare in cancer management, cardiovascular disease, and a whole host of other metabolic diseases: https://www.labiotech.eu/tops/crispr-technology-cure-disease/

The stumbling block in this hubris of “causality” suggests we are still a long way from the mountain we need to climb. Cancer, for instance, is a multigene-epigenetic-mesenchymal corrupted system of influence, and a single silver bullet is not going to save the day. Adding to the complexity are the catacombs of crosslinked, cross-talking pathways beneath the cell surface that invoke their own potential when blocked by small molecules or antibodies. That cautionary statement must give us pause to reflect upon what to do. Work is underway to tease out the trees in the forest, tying a yellow ribbon on each one as we delve deeper in search of a cure.

That then brings the causal semantics of the statisticians into the open, where the science outside the laboratories is being marked down in percentages for the eyes of the beholders. Karl Pearson would have been happy to see his methods lauded and enriched with more analysis than has been done in any real science in the last ten years, so to speak, but disheartened at their abuse. With all the regressions, t-tests, ANOVAs, and MANOVAs sprinkled through most of the medical literature, a kernel of truth seems barely enough to ruffle the pure white, undisturbed surface of the “scientific,” statistical landscape. The true question is: does “X” cause “Y”? The answer seems wanting in most of the medical literature that pervades the burgeoning load of journals proliferating faster than I can think, let alone write these words.

Relying on “p-value” SIGNIFICANCE https://jedismedicine.blogspot.com/2013/06/significance.html?spref=tw is fast becoming an albatross around the author’s neck, as more and more non-statisticians begin to see the light through the dark art of this field of doctored mathematics. Recently, I heard someone proclaim that perhaps we should not use the term “significant” with a p-value of 0.05. Indeed! But then the same individual went on to suggest that we should perhaps use “less significant,” “fairly significant,” or “most significant.” Tapering one’s bias into terminology through statistical manipulation will only add another layer of opacity to an already opaque world in disguise.
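A short simulation makes the p-value point concrete. Everything here is a standard illustration, not from the original post: when two samples come from the exact same distribution, so "X" cannot possibly cause "Y," roughly one test in twenty still clears the 0.05 bar by chance alone (the large-sample normal approximation to the t-test is used to keep the sketch in the standard library).

```python
# A sketch of why p < 0.05 "significance" misleads: under a true null
# hypothesis, about 5% of tests are "significant" by luck alone.
import random
from statistics import NormalDist, mean, stdev

random.seed(1)
alpha, trials, n = 0.05, 2000, 50
false_positives = 0
for _ in range(trials):
    # Two samples drawn from the SAME distribution: no causal link exists.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    se = ((stdev(a) ** 2 + stdev(b) ** 2) / n) ** 0.5
    z = (mean(a) - mean(b)) / se            # large-n normal approximation to the t-test
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    if p < alpha:
        false_positives += 1

print(false_positives / trials)  # hovers near 0.05: "significant" with no cause at all
```

Run enough selected variables through enough tests, and a headline-worthy p-value is practically guaranteed.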

Already the stark scientific world of cancer care has changed the parameters for a Partial Response, or PR, when measuring a drug's effect on cancer: from a >50% reduction in the size of the tumor to a ≥30% reduction as the definition (of a PR). And then, in an obvious collapse of the norm, they proposed PFS (Progression-Free Survival) as a surrogate for Overall Survival (OS). Read here: Progression Free Survival (PFS) https://jedismedicine.blogspot.com/2012/05/progression-free-survival-pfs.html?spref=tw The former can be massaged by using longer interim analyses to show a longer PFS. However, as many a study has shown, increasing PFS has NOTHING to do with OS in a large majority of cases. So why the change? It seems that utilizing the new surrogates leads to a host of quick but unverifiable results, rather than the tedious and methodical means of extracting the truest information.

Today, once you get your p-value based on the selected variables and other flimsy outcomes, the publishers-a-plenty roll their rollers on their massive machines. And once the headline has been headlined and social media has trumpeted the unvalidated, contrived result, who cares about the small-print retraction buried on page 28? Indeed, who cares, because everyone seems to have attention deficit disorder these days, from the “scientists” to the “marketers” to the “consumers,” in one fell swoop. Yet even in these days of quarterly earnings, the long-lost art of tedium would still bear lasting fruit for the annual bonus of a manager and his or her longevity at the helm.
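The threshold change described above is easy to see in code. This is a simplified sketch of the shrinkage arithmetic only (the actual response criteria involve sums of target-lesion diameters and confirmation scans): the same tumor measurement fails the old >50% bar but clears the relaxed ≥30% one.

```python
# A minimal sketch of how relaxing the partial-response cutoff from >50%
# shrinkage to >=30% shrinkage relabels the very same measurements.

def shrinkage(baseline_mm: float, current_mm: float) -> float:
    """Fractional reduction in tumor size from baseline."""
    return (baseline_mm - current_mm) / baseline_mm

baseline, current = 50.0, 32.0               # a 36% reduction in size
old_pr = shrinkage(baseline, current) > 0.50  # old definition: >50% reduction
new_pr = shrinkage(baseline, current) >= 0.30 # revised definition: >=30% reduction
print(old_pr, new_pr)  # False True -- same tumor, new label, better headline
```

No biology changed between the two lines; only the definition of "response" did.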

The reason I bring up the issues of gene therapy and statistics in the same breath is quite simply to arm the disarmed. Gene therapy has its complications; unintended consequences, to be sure. Deep Learning through neural network deployment to harness the “gene speak” between overexpressors (promoters) and (suppressors) is a difficult communication to decipher, let alone the epigenetic chatter and its influence, and the pleadings of the interstitial content with its epithelial-mesenchymal transitions. Read here: https://techxplore.com/news/2019-12-deep-gene-interactions.html Additionally, the viral vectors used to transfer the gene into the DNA of an individual can have serious consequences. In regard to CRISPR editing, there is a whole host of issues that also need to be resolved before wholesale mainlining into science. Although a new study from Johns Hopkins seems to help the cause of CRISPR technology, it is only a first step and was performed only in mice. Using a targeted gene epigenome editing approach in the developing mouse brain, Johns Hopkins Medicine researchers reversed one gene mutation that leads to the genetic disorder WAGR syndrome, which causes intellectual disability and obesity in people. This editing was unique in that it changed the epigenome (how the genes are regulated) without changing the actual genetic code of the gene being regulated. Read here: https://medicalxpress.com/news/2019-12-genetic-brain-disorder-mice-precision.html One thing we all must be cognizant of is that using the arbitrary probability methods of statistics would not be prudent in determining success. Failure, however, will be obvious to all. Failure may be due to a lack of incorporation of the genetic material into the genome, to antiviral antibodies against the viral vectors, or to other errors of methodology.
Computational cataloging will help expedite the process, as it has done in other sectors of human activity, as reams of data are processed to determine the causality in a disease. We will progress as we always do, in fits and starts. But the progress we make will only come through careful, methodical, and tedious science.

I clamor because I am concerned. I write because I care, and I learn because I want to propel the young mind to greater heights than it thinks it can achieve. So, keep the faith in the tedium of real science and stay on the journey towards excellence.



Bon Voyage!

Sunday, December 8, 2019

THE HUMAN FACTOR


The drumbeat for Artificial Intelligence is deafening. But when you look at the skinny, it all boils down to IFTTT, doesn’t it? There is also this big overhang called Big Data that seems to propose that the entire world’s knowledge is contained in a server somewhere, from where the computational binary codes have determined the essence of life and death. Not so, it seems, when you really look at it hard enough. And then you realize AI is about making a machine do the IFTTT with a select few variables. And that the machine does well. Let me give you an example that everyone will tout as their aha moment, immediately subjugating their humanness to the unfailing virtues of AI. But hold on a bit, friend, hold on!



Can we translate the meaning of the human essence into a machine? Nope, not by any standard. Ah, but they say: in time, in time, and soon. To that I am compelled to say, not so fast, young Turk, not so fast.



To give credit to the function of a machine working through the AI code of conduct, one need look no further than the aircraft autopilot. The entire premise of the autopilot is based on four variables: lift, weight, thrust, and drag. Compute those variables, and all three axes (X, Y, and Z) are easily manageable. At takeoff, thrust overcomes the parasitic drag while the growing lift overcomes the weight of the aircraft. At cruise, the lift created by the wings balances the weight, and thrust balances the induced and parasitic drag. At descent, the thrust is diminished to allow the other forces to act accordingly and bring the aircraft transitioning into the landing phase. But what happens if a sudden force of moderate turbulence is applied to the aircraft? The autopilot automatically disconnects so the human can take over and override the impact of the sudden change. Any sudden change in wind parameters would raise the specter of a “Ding, Ding” followed by “wind shear,” if the aircraft was equipped with such a hardwired, coded algorithm. So, if there are only a few finite, known, and calculated variables, the machine-language code works well and delivers with flying colors. Add an additional insult to the quiet world of the AI, and then you know who the boss is. Autopilots need human managers who are fully prepped on the nuances of machine shortcomings and failures. Sometimes, however, we humans can ignore the autopilot’s beseeching warnings and hit the ocean or the earth with a ferocity of momentum, and wonder, what gives? Humans, too, are fallible. That’s how we learn.
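The disconnect logic described above can be sketched in a few lines. Everything here is illustrative and invented for this post: the force values, the trim tolerance, and the 0.5 g gust threshold are not real avionics numbers, only a stand-in for "a few finite, known variables plus a hard limit that hands control back to the human."

```python
# A toy sketch of the autopilot premise: four modeled forces, one envelope.
# All thresholds are illustrative, not taken from any real flight-control spec.

def in_trim(lift: float, weight: float, thrust: float, drag: float,
            tol: float = 0.02) -> bool:
    """Cruise trim: lift balances weight and thrust balances drag, within tolerance."""
    return abs(lift - weight) <= tol * weight and abs(thrust - drag) <= tol * drag

def autopilot_engaged(lift: float, weight: float, thrust: float, drag: float,
                      gust_load_g: float, max_gust_g: float = 0.5) -> bool:
    """Stay engaged only while the four forces are balanced and the gust
    stays inside the coded envelope; otherwise disconnect for the human."""
    return in_trim(lift, weight, thrust, drag) and abs(gust_load_g) <= max_gust_g

# Calm cruise: the four known variables suffice, the machine flies beautifully.
print(autopilot_engaged(700e3, 700e3, 120e3, 120e3, gust_load_g=0.1))  # True
# A sudden 0.8 g jolt of turbulence: the machine defers to the human manager.
print(autopilot_engaged(700e3, 700e3, 120e3, 120e3, gust_load_g=0.8))  # False
```

Inside the envelope the code is the boss; one step outside it, and you know who the boss really is.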



Remember HAL 9000 from 2001: A Space Odyssey? We haven’t yet achieved full subservience to the red dot. Perhaps the example of an autonomous land vehicle crashing into pedestrians and fences is a reminder that machine-learned machines need to stay in the machine-machine world. Humans are unpredictable, and therefore their drive toward the unexpected is fraught with calamities. Touting the success of the autopilot in the airplane and the boat, and the cruise control in the car, and comparing that to the wide world with its multitude of variables, is hubris in a nutshell. The wide world has many rules, from quantum mechanics to Newtonian laws, to the flows of fluid and thermal dynamics, to the continual motion of the living, breathing planet, its aching tectonics, its atmospheric lifts, and its restless oceans. To punch all those variables into the artificiality of a machine and expect it to perform with the agility and grace of the multitasking, parallel-processing human brain is asking for way too much.



Reversing course from the machine to the human again: there are some who believe that AI will replace humans in healthcare. Perhaps the best we will get is predictive analytics with percentages and potentials, but not what is needed to save the human from a malfunctioning code in their genes. Even delving into reading X-rays and diagnosing disease from the images is based on IFTTT. Feed in a million X-ray images, code them with the disease accordingly, and then let the AI figure out the diagnosis of the presented image. The missing context that humans rely upon is, well, missing. The “mass” in the lung could be benign or malignant; it could be infectious or a fungal ball; it could be scar tissue or a rampant virus; it could be fluid or the beginning of something really malicious. Context matters. Someone would say we can code for that as well. OK, then how about the overall state of the human being, one asks? We can code for that as well. OK, how about human feelings? We can do that as well, by coding their facial characteristics. OK, how about their fears and the subsurface emotions coursing through their veins? Even humans can’t always figure that out; actually, with a little communication, we can. But one gets the message: driving the false negatives and the false positives toward zero across the many existing variables is nigh impossible. But the trials and the beat will go on. We might succeed one day, but not today.
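The "feed in a million labeled images" idea can be caricatured with a tiny nearest-neighbour lookup. The features and labels here are entirely invented for illustration: the point is that two clinically different patients whose films look the same get the same answer, because the clinical context is not one of the coded variables.

```python
# A toy sketch of image-label IFTTT: match the new image's features to the
# closest labeled training example. Features and labels are invented.

training = [
    ((0.9, 0.2), "benign nodule"),
    ((0.9, 0.8), "malignant mass"),
    ((0.3, 0.1), "clear"),
]

def diagnose(feature):
    """1-nearest-neighbour: return the label of the closest training feature."""
    return min(training,
               key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], feature)))[1]

# The same shadow on the film from two very different patients, say a heavy
# smoker and a farmer with fungal exposure, yields one and the same answer,
# because history, exam, and fear are simply not features in the model.
print(diagnose((0.88, 0.75)))  # "malignant mass" for both patients
```

The classifier is not wrong about the pixels; it is blind to everything that is not a pixel.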



There is something to be said about the human factor. It might be messy, it might be chaotic, it might be fractured; it might also be grand, or it might be the cause of a lot of human follies and the start of a learning process to better that which is flawed. And that, as they say, is the greatest of gifts contained within the three-pound flesh solidly protected by its osseous confines. From the powered flight of the Wright Brothers to the behemoth flying machines controlled by a few levers up front is a long way in a hundred-plus years. The pace of progress accelerates as mundane tasks are relegated to the machines with their IFTTT and the human brain conceives ideas for the future! We live on ideas. We grow with ideas. We are the ideas that make the world go around.
“Second star to the right and straight on ’til morning!”

Steady as she goes!