By David Debenham
“We do not see things as they are, we see things as we are.” 
Traditional doctrine treats fraud victims as simply those with weak internal controls, just as we treat victims of theft as those who fail to lock their doors and install burglar alarms. In my view the truth is far more complex.
There are fraud victims who are simply told lies and who, as soon as they uncover the lies, are mentally equipped to start a lawsuit. They invested in the fraud based on a rational calculation, and when they find out the information they relied on was false, they rationally determine who is at fault and sue for compensation, report the matter to the police, or complain to the relevant professional body. That, however, does not encompass the entire universe of fraud victims. Many victims believe lie after lie and invest not only their entire life savings, but beg, borrow and steal from relatives, or invest family members’ money until vast fortunes are lost in what appears to be an avalanche of throwing good money after bad. It is this second group of fraud victims that concerns us here.
Most limitation statutes begin the period in which a tort victim must sue when that person knew, or ought to have known with the use of reasonable diligence, that a tort had caused them damage. The usual exception is for a vulnerable person, such as a child or a person with a mental infirmity, for whom the limitation period only starts when they regain their senses or reach the age of majority. The fact of the matter is that in many cases the victim of fraud does suffer from a form of mental infirmity or delusion.
Fraudsters often create a “reality distortion field” of “fake news” that lures in precisely those persons most susceptible to their deceptions. To understand this, we have to go back to the first principles of cognition.
First, there is an objective reality, and then there is the reality we perceive. When we perceive, we are literally of two minds:
- The logical mind, which, subject to errors due to logical fallacies, is what we use to form an analysis that leads to a logical conclusion. This takes effort and time, and its conclusions are open to revision based on additional data. This is our scientific brain at work.
- The emotional or intuitive mind, which often operates at the level of the unconscious and allows us to make a myriad of innocuous decisions without taking the time and trouble of going through a rational analysis. We bought A instead of B simply because we liked A more, often for reasons that are obscure even to us. At work is a series of unconscious lenses or biases that help us make “irrational” decisions quickly so we can move on with life without being slowed down by the sheer volume of decisions we have to make.
Mr. Spock of Star Trek fame is the shining example of the rational “Vulcan” who represses his “emotional” side to be the optimal “science” officer. Counter-balancing him is Dr. McCoy, the simple country doctor who sees the “human” (emotional) side of every dilemma. Captain Kirk then makes the final decision based on the submissions of Spock and McCoy. So, we would want to believe that we all balance our logical and emotional decision-making minds in perfect harmony. We know better.
Big decisions are often made quickly, and on irrational grounds. “Love at first sight”. I just “love” the white Toyota Corolla but I wouldn’t be caught dead in the red Toyota Corolla, even though the white one costs $5,000 more and it would only take $250 to paint the red one white. We often make important decisions without rational analysis. “Go with your gut”, “it just feels right”, or “let your instincts guide you” rather than be trapped in the “analysis paralysis” caused by reasoning to a solution. If the rational analysis later “kicks in”, we call it “buyer’s remorse” or say we “acted in haste, now repent at leisure”. Fraudsters often prey on this by playing on our unconscious biases, including irrational trust cues (“you can trust me, we are both lifelong Blue Jays fans”), and then impose time limits to commit the fraud victim to a rush to judgment that excludes rational reflection. Once committed, the fraud victim can only reverse their decision by doing the hardest act known to our species: admitting they made a mistake.
Instead, what happens is that the fraudster forestalls reflection and buyer’s remorse by repeating and emphasizing misinformation that plays on unconscious biases. “Go with your gut”. “Let your intuition be your guide”. “Come on, you know it feels right”. “Don’t listen to your sister, she’s just jealous because you got the jump on her for a change”. Rather than engage in a painful analysis that would lead to the conclusion that you made a mistake, your emotional self irrationally emphasizes positive data and rejects negative data in order to remain comfortable with the original decision. Remember the old car commercials with the beautiful woman lying on the hood: they were as much about making male purchasers irrationally feel good about their purchase as they were about selling the car in the first place. Either way, they were never about a rational analysis of buying one car over another.
Let us begin with machine learning as a model for our brain. For the sake of argument, we start with a blank slate at birth and are then bombarded with a host of stimuli that the brain has to 1) sort into relevant and irrelevant, and 2) prioritize the relevant stimuli in order of importance, so that we can “judge” the situation and “act” appropriately. With babies this is a continual process of trial and error in which we learn to “trust” our caregivers’ guidance in determining the relevance and importance of the sensory data inundating our senses. Objective reality floods our brains with data, and our parents and siblings, as trusted advisors, provide guidance on how to organize those data into a useful organization or paradigm. Data overload, in the form of reams of unfamiliar data, causes stress leading to crying, followed by parental reassurance and guidance.
“Experience” is thus a combination of our own primary perception of the world (data) and our secondary perception of how to filter this data for useful information, based not only on our own experiences but also on the secondary experiences of those whose guidance we learn to trust.
As we grow up, we learn to trust the judgments of friends, teachers, and relatives about how to filter data, resulting in tension between our existing filters and new ones, which we have to reconcile with our personal judgments. On an empirical level, our filters are used to predict results, and by striving to minimize “prediction errors” we improve the filters through which we “see” reality. Teenage angst is caused by the need to reconcile the old filters we have traditionally accepted from our parents with the new filters we accept from our peers. Eventually we “freeze” our filters once they achieve an acceptable level of success, making only minor changes when new data is not explained by the existing paradigm we are using to explain the world around us and to guide our actions. Think of a magic trick. The magician’s sleight of hand makes it appear that a rabbit was pulled out of the hat. Because that contradicts our experience, we look for an alternate framework to explain this “prediction error” regarding the origin of the rabbit. The longer we search in vain for that better explanation, the better the trick. But why do we enjoy the trick? Because it proves there is more to life than reason and logic; no one wants to be a Vulcan.
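The filter-refinement loop described above can be sketched as a toy learning rule. This is only an illustrative analogy, not anything from the text: the “filter” is reduced to a single numeric estimate, and the observations and learning rate are invented for the example:

```python
def refine_filter(estimate, observations, learning_rate=0.1):
    """Nudge an internal estimate toward each new observation,
    shrinking the prediction error a little at a time -- the way
    the text suggests we tune the filters through which we
    'see' reality."""
    for observed in observations:
        error = observed - estimate          # prediction error
        estimate += learning_rate * error    # small correction, not a rewrite
    return estimate

# Repeated exposure to consistent data pulls the estimate toward it,
# but each individual correction is small: the filter is "sticky".
print(round(refine_filter(0.0, [10.0] * 50), 2))
```

The small learning rate is the point of the analogy: an established filter changes only incrementally, which is why a well-entrenched (even false) paradigm resists isolated pieces of contrary evidence.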
“You see, gentlemen, reason is an excellent thing, there’s no disputing that, but reason is nothing but reason and satisfies only the rational side of man’s nature, while will is a manifestation of the whole life, that is, of the whole human life including reason and all the impulses”
So where do things go wrong? The “twitch reflex”, or master controller of our brain that makes the initial decision to go with our reflexes or analytical processes, is an emotional one that uses our biases to make immediate decisions when we feel pressed to act. The fraudster therefore creates an “emergency” for the fraud victim to act reflexively rather than reflectively.
Having passed this gate, the fraudster then plays upon our “trust” reactors. Everyone intuitively trusts different people for different reasons. Trust — whether in a person or a product — is a compilation of data biased by “positive” or “negative” emotive experiences. It is that data squeezed through an individual emotional filter, active in every encounter. Are affinity frauds the result of happy group experiences throughout your childhood? Do you fall for charmers because they remind you of your father and how he made you feel? When that filter blinds you to danger, when it nudges you again and again to put your faith in a fraudster despite a dearth of data, your individual trust filter betrays you. In another person, the same stimuli may provoke “hatred” of the fraudster, precisely because his charm reminds the potential mark of a charming father who ran off and abandoned his family.
So, what went wrong? A problem at the perception level is often associated with mental illness. A problem at the “filter” level is associated with a “trust imbalance”. If we are not altogether content with our lot in life, or there is some other motivation at work, we are likely to trust someone we ought not to. If there is a perceived imbalance between our sense of self-worth and our actual lot in life that suggests we deserve more, we may decide that our personal paradigm is the problem. In order to get the life we deserve, we then fasten on a cult ideology or scheme that promises us our just deserts. All we have to do is shrug off the shackles of our existing belief systems (filters), see the world in the new way advocated by someone we are being asked to trust, and act accordingly. Like Morpheus giving Neo the red pill in the movie The Matrix, we are shown that the world we believed was real is illusory, and that once we see the real world as exposed by the red pill, our path to happiness becomes real. The fraudster performs two tasks simultaneously: he or she encourages trust by playing on our unconscious biases while simultaneously discouraging reliance on our reflective, logical thought processes. Those who revert to logical reflection see the fraudster’s trick for what it is and reject it out of hand. Those whose unconscious biases are triggered are blind to the trick and “fall” for it. What follows is a trust in someone that overwhelms our previous trust sources and becomes a new repository of our trust, and with it the creation of a new filter that rejects what we previously believed to be true and real as either false or incomplete.
In the case of fraud, the fraudster’s “red pill” or “black box” promises a world of happiness and contentment so long as one keeps faith with the new paradigm being advocated by the fraudster. Examining the paradigm carefully would reflect a disappointing lack of trust in the fraudster, which would result in being cast out of the group of new believers and forfeiting the benefits promised as part of the fraudulent scheme.
As the new filter results in evidence to the contrary being rejected as irrelevant or misleading, those who propose other filters are rejected as untrustworthy sources of data. The believers reinforce their common commitment to the new paradigm and “filter” out uncomfortable data. The true believers’ “will power” is tested, and the new paradigm is defended against the “naysayers” trying to test the fraud victim’s resolve with “false news”. The so-called echo chambers of social media groups do more than repeat beliefs; they actually reinforce them, as like-minded “true believers” amplify, enhance and refine the new paradigm and thereby magnify the apparent trustworthiness of the original source. In this way, a “community” becomes splintered into a group of rival communities based on competing paradigms. It is only when the failure of the new paradigm is irrefutably demonstrated (usually by the victims having run out of money without any happy result) that the scales fall from the fraud victims’ eyes: the paradigm has failed for them, and they are ready to trust other sources of information by engaging in conscious reflection on irrefutable data. Usually, the victims ask themselves how they could have been so stupid or greedy, when they should ask themselves what unconscious biases made them so susceptible to the “snake oil” being sold to them. It is only when the fraud victim “comes to”, engages their logical brain and realizes they have been duped that the limitation period for them to sue should begin to run. Before that, the unconscious mind has control of the “true believer” in a form of stupor akin to a hypnotic trance that prevents the victim from acting rationally.
Exemplar: Professional Bias
“The study of law is something unfamiliar to you… unlike any other schooling you’ve had before… I train your minds. You come in here with a skull full of mush; you leave thinking like a lawyer.”
Athletes develop their “twitch” muscles to improve their quickness or reflexes. We do the same mentally, through various educational systems, training and experiences that allow us to “intuitively” know the right answer without going through the long, laborious rational analysis a novice would require. This requires us to build a mental framework or “paradigm” that provides shortcuts to the right answer. The first stage in this process is to adopt someone we trust as the source of the right answer: the instructor. The instructor provides you with a new way of looking at problems; you either trust the instructor and adopt his or her worldview, or you fail the course. A paradigm change, or a change in one’s worldview, results in a change in the person or people one trusts, at first expanding that group and eventually substituting the new group for the old one.
In the movie “The Paper Chase”, one student, Brooks, had a photographic memory, so he could remember all of the “facts” of a particular case, but this was useless to him as a prospective lawyer because of his inability to adopt the filter of the law proposed by his law professor, which determines which “facts” are a) legally admissible, b) legally relevant and material, and c) legally dispositive. Without the ability to absorb these new filters, whether consciously or through osmosis, Brooks failed out of law school. The main character, Hart, on the other hand, works hard to learn these legal filters, only to reject them at the end of the movie on the delayed realization of the distasteful person they caused him to become. The point is that adoption of a professional worldview is critical to professional success, and that worldview comes with conscious codes of conduct and unconscious biases. So, for example, lawyers are trained to blind themselves to the flaws of their clients in order to represent them, which often results in lawyers becoming the dupes for their clients’ illegal enterprises. Police officers’ unconscious biases can often cause them to blindly “rush to judgment” about the guilt of a particular suspect. It is therefore important to understand that fraud victims who changed their paradigm because they accepted a fraudster as their instructor, and were taught a false paradigm that led them to become victims, are no different from the rest of us. And until they regain a more sensible footing on whom they trust to form their worldview, they are helpless to see the fraud for what it is.
Thomas Kuhn’s The Structure of Scientific Revolutions illustrates the “stickiness” and resilience of paradigms in the context of the world of science. Scientists historically rejected the new paradigm that placed the sun at the centre of the solar system, rather than replacing the paradigm that the earth was the centre, by discounting evidence to the contrary as unreliable or misleading. Until the existing paradigm 1) is faced with overwhelming contrary evidence and 2) there is a new paradigm that fits the existing and new evidence better than the old one, old filters will persist. Without a new paradigm, the brain is left with no choice but to use the old filters. Old methodologies are used in the absence of new methods.
Consider all the biases we know of: are they not conservative of the present way of thinking? Confirmation bias filters out evidence that does not support what we already believe. Implicit bias, or stereotyping, sees new evidence through the existing patterns of behaviour with which we are familiar. Expert bias, or the Einstein effect, means that experts will use old tools to solve new problems, no matter how unsatisfactory the result, instead of simply developing new tools, because of the power of the existing paradigm.
Consider the following hypothetical: I have three water jugs of 3, 21, and 127 litres in size. I can fill and empty the water jugs as many times as I want, but I must measure out exactly 100 litres. Solution: fill the 127-litre jug, pour 21 litres into the second jug, and then pour the remaining water into the 3-litre jug twice. Result: 100 litres (127 − 21 − 3 − 3).
Now change the hypothetical to jugs of 3, 23, and 49 litres: how do we get exactly 20 litres? Those who are familiar with the first solution normally compute 49 − 23 − 3 − 3 to get 20 litres. Those unfamiliar with it simply fill the 23-litre jug and pour 3 litres into the smallest jug, for the more elegant solution 23 − 3 = 20. Paradigms can block the blindingly obvious. And so it is with the paradigms (stories) that fraudsters create. Those who have bought into a fraudster’s paradigm will do the same mental gymnastics as those calculating the second jug problem, rather than arrive at the blindingly obvious solution.
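The arithmetic of the two puzzles above can be checked directly. A minimal sketch in Python, using only the jug sizes and targets given in the text (the function names are mine, for illustration):

```python
# Each solution is a sequence of fills and pour-offs; the jugs are
# filled or emptied completely, so the result is simple arithmetic.

def first_puzzle():
    # Jugs of 3, 21 and 127 litres; target 100.
    # Fill the 127 L jug, pour off 21 L once, then 3 L twice.
    return 127 - 21 - 3 - 3

def second_puzzle_habitual():
    # Jugs of 3, 23 and 49 litres; target 20.
    # Solvers primed by the first puzzle reuse its pattern:
    # big jug, minus middle jug, minus small jug twice.
    return 49 - 23 - 3 - 3

def second_puzzle_elegant():
    # Unprimed solvers see the shortcut: fill 23 L, pour off 3 L once.
    return 23 - 3

print(first_puzzle())            # 100
print(second_puzzle_habitual())  # 20
print(second_puzzle_elegant())   # 20
```

Both routes to 20 litres are correct; the point is that the primed solver takes the three-step detour without ever noticing the one-step answer.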
As one author noted:
“We may believe that we are thinking in an open-minded way, completely unaware that our brain is selectively drawing attention away from aspects of our environment that could inspire new thoughts. Any data that does not fit the solution we are already clinging to is ignored or discarded.”
So, what causes radical paradigm shifts? It is often a mental reflex of one’s unconscious that says “I really love this guy” or “I really love this idea” that causes fraud victims to fall down the rabbit hole of a “reality distortion field” created or adopted by the fraudster for nefarious purposes. That translates into dropping an existing or traditional way of thinking about the world in the face of a revolutionary new one that promises great hope of alleviating present discontent. Once down the rabbit hole, all of our conservative biases confirm the “new reality” in the face of contradictory evidence. The hardest thing for any person to do is admit they were wrong. Instead, the logical, analytical mind remains in suspended animation until the fraud victim is jarred out of their delusional state by some dramatic event.
In the face of this, it is the job of the fraud fighter to show the fraud victims that (a) the fraud fighter is a trustworthy source, and (b) the victims have fallen prey to an untrustworthy one, so that the victims can be deprogrammed by an intervention that shows them a way out of their present predicament beyond simply investing more money in a fraudulent scheme. Without that, the fraud victim will simply say that this may be the world you live in, but they choose not to live in it. Until their rational self takes over, they remain under a spell that makes the idea of suing the fraudster unthinkable.
 Nin, Anaïs. “Seduction of the Minotaur” (1961).
 Professor Kingsfield, “The Paper Chase”.
 Bilalić, M. “Why Good Thoughts Block Better Ones.” Scientific American, p. 13 (Fall 2020).