[aCASThinkers] tend to possess admirable cognitive abilities, such as inquisitiveness, curiosity, self-checking, self-awareness, and the like.
One useful way you, the reader, can manifest such traits at critical moments in your career is to raise the question:
“How do {I/you} know what {I/you} think {I/you} know?”
This powerful question is an exemplar of an
applied cognitive loop that promotes early error discovery and self-correction.
So-called “Epistemic Rationality” advocates this [simplified] aspiration:
“Care only about truth and accuracy…
what is true, factual, and well-supported…
and the process(es) for determining same.”
This aspiration also informs the expectations of “Professional Skepticism” in fields such as law and accountancy…and its tenets are ideal for all professional contexts.
When considering the question “How do {I/you} know what {I/you} think {I/you} know?” consider at least three perspectives:
  1. Critical Thinking: Skepticism regarding the claim;
  2. Analytical Thinking: An organized, data-centric approach to gaining an understanding of the claim; and
  3. Systems Thinking (in an illustrative business context): An evaluation of the end-to-end business; the marketplace; the industry; channels-of-distribution; supply-chains; general economic conditions; marketing; packaging; product; pricing; consumer preferences; retailer practices; et cetera.
Decades of empirical observation provide compelling evidence that [aCASThinkers] possess admirable cognitive abilities and tend to be distinguished, and distinguishable, by many characteristics, including being:
  • Inquisitive,
  • Observant,
  • Curious (especially regarding “How things work”),
  • Introspective,
  • Questioning,
  • Thoughtful,
  • Purposeful,
  • Doubtful,
  • Skeptical (for the most part, in a polite, professional fashion),
  • Reluctant to accept things (e.g., claims, conclusions) “at face value,”
  • Independent in thought, and
  • Relentless regarding confirmation/disconfirmation (e.g., facts, evidence, context, analysis).
Such traits manifest a cognitive loop that promotes early error discovery and self-correction. This loop is sometimes referred to as “How do {I/you} know what {I/you} think {I/you} know?” (See “Epistemic Rationality,” below.)
  • Unfortunately, such seemingly admirable traits too frequently lead to attendant problems.
Many people (empirically: frequently those in positions of power) tend to dismiss evidence-based approaches to understanding and solving problems in favor of politically powerful [non-]solutions.
  • If, as an [aCASThinker], you have yet to encounter this form of resistance, it is highly likely you will.

Epistemic Rationality

For the curious: “How do you know what you think you know?” is a concept that emerges from a style of purposeful thinking known as “Epistemic Rationality” (sometimes expressed as “Epistemological Rationality”).
Although there exists an abundance of weighty, in-depth, philosophical tomes on the topic, the gist of Epistemic (Epistemological) Rationality is quite simple:
“Care only about truth and accuracy…
what is true, factual, and well-supported…
and the process(es) for determining same.”

Challenge Assumptions

Challenging Assumptions is a key capability for determining “How do {I/you} know what {I/you} think {I/you} know?”
  • [Subscribers: When it arrives in your sequence, be sure to read [aCAST][Insight][1016] – When Is the Best Time to Cancel a Doomed Effort?]
An uncountable number of professional careers have been ruined by relying on assumptions…that were incorrect. Many times, it is not what you don’t know that becomes an issue…it’s what you believe you know that isn’t so.
To this point, an exercise for the motivated:
  1. Who invented the lightbulb?
  2. Who invented the telephone?
  3. What was the last name of the pilot of the world’s first powered airplane flight?
(Spoilers: 1. Not Thomas Edison; 2. Not Alexander Graham Bell; 3. Not Wright (as in, Wright Bros).)

The Importance of Professional (and Personal) Skepticism

  • [Subscribers: When it arrives in your sequence, be sure to read [aCAST][Insight][1011] – Evidence: Developing, Evaluating, and Contesting.]
Whereas some professions (e.g., accounting, law) have adopted formal expectations for “Professional Skepticism,” the tenets of this ideal are suitable for all professional contexts.
Minimally, Professional Skepticism means:
  1. Having a questioning mind,
  2. Being alert to anything that may indicate misstatement due to error or fraud, and
  3. Critically assessing evidence.

[aCAST][Applied]: Putting the Applied in [aCASThinking]

(The following example is from my personal history*. It is factual, yet edited for length and readability. Do you have an illustration of [aCASThinking] you’d like to contribute for publication? Drop an email to rick@acast.pro.)

“It’s Obvious [Idiot]! We Need a New Marketing Campaign!”

Years ago, I was the “Data Geek” member of a consulting team engaged by a large-volume manufacturer and distributor of certain consumer items.
The client’s SVP of Marketing opened the “Kick-off” meeting with our team with the claim “Sales are down. We need a new marketing campaign.”
For several moments—supported by an impressive array of slides, charts, and graphs (for the cognoscenti: complete with twenty-seven eight-by-ten color glossy pictures with circles and arrows and a paragraph on the back of each one; OK, that part wasn’t true, but double-secret bonus points if you know the source)—the client (SVP) forcefully spoke about their current marketing campaign, his perception of the campaign’s many failures, and his expectations for the new campaign our team was engaged to develop.

“Captain, There Are Always Alternatives”

As the SVP’s diatribe wound down—to the visible dismay of the lead consultant—I asked, “Given the uncountable number of possible causes for a decline in sales…well…? Have explanations other than Marketing Campaigns been considered?”
Inaudibly, the consulting team lead groaned, shook his head, then glared at me.
Angrily, the SVP faced me, pointed to a graph, and snapped, “It’s obvious. [He didn’t say “Idiot,” but everyone in the room could hear he thought it.]” He tapped the graph, and added, “We need a new marketing campaign.” He waved at his team of Marketing Analysts, and said, “Everyone knows that.”
[Editorial: Whenever I hear someone claim (something to the effect of) ‘…everyone knows that,’ I have found—from painful experience—that [a] everyone does not know that, and [b] whatever ‘it’ is…is probably distorted; incomplete; highly political to the point no one wants to challenge it; wrong; or little more than a fact-free desire.]
After our team lead nervously intervened to mollify the client and restore order, the SVP left and we laid out a series of tasks to immediately address with the client’s staff.

Raw Data

As the designated Data Geek, one of the first things I requested from the SVP’s staff was access to all engagement-applicable raw data (also known as “source” data).
The client’s staff responded by handing me photocopies of the same charts, graphs, and slides the SVP had used in his opening remarks.
Politely, I demurred.
“Why?” the client’s POC (Point-of-Contact) demanded. “My team already crunched the numbers.” He pointed to the clutch of charts and graphs.
My reply extolled the principles of “Due Professional Care,” “Independence,” and other elements of professional ethics…ending with my re-affirmed need to access the raw (source) data.
The client’s POC was irritated, but acquiesced and made the necessary arrangements.

Crunched Numbers

At the conclusion of a few days of “Data Jedi” work, (at least) four conclusions regarding the decline in sales were inescapable (a small illustrative sketch of this kind of before/after check follows the list below):
  1. Marketing was actually over-performing in attracting new customers;
  2. Customers were abandoning the company’s product in favor of the competition’s at elevated—and rapidly increasing—rates;
  3. The decline in sales corresponded to a marked reduction in the quality of the product caused by the firm’s somewhat recent initiative to reduce product cost by ~30-50%; and
  4. Because it wasn’t the problem, changing the Marketing Campaign would be unlikely to remedy the sales decline.
  • While it can never be known with certainty (an example of the “Parmenides Fallacy,” the subject of a future [Digest]), it is entirely possible that any change to the Marketing Campaign would increase the rate of decline in sales.
  • If so, it would render any investment of time, effort, and money in developing such a campaign wasted.
  • Worse, the “opportunity cost” would be material and unnecessary.
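For the curious, here is a minimal, hypothetical sketch (in Python, using pandas) of the kind of before/after check described above. The figures, the column names, and the cost_cut_month are invented for illustration; the point is the structure of the analysis: compare new-customer acquisition, customer attrition, and unit sales before and after the product change, rather than examining marketing metrics alone.

  # Hypothetical illustration only: the figures, column names, and cutoff month
  # below are invented for this sketch; they are not the client's data.
  import pandas as pd

  # Synthetic monthly "raw" data standing in for the source data requested above.
  raw = pd.DataFrame({
      "month":          range(1, 13),   # months since the start of the period
      "units_sold":     [100, 102, 104, 101, 96, 90, 84, 78, 72, 66, 61, 56],
      "new_customers":  [10, 11, 11, 12, 13, 13, 14, 15, 15, 16, 17, 18],
      "lost_customers": [8, 8, 9, 10, 14, 17, 20, 22, 24, 26, 27, 29],
  })
  cost_cut_month = 4  # assumed month the cost-reduction initiative reached the market

  before = raw[raw["month"] < cost_cut_month]
  after  = raw[raw["month"] >= cost_cut_month]

  # 1. Is Marketing under-performing? New-customer acquisition, before vs. after.
  print("avg new customers :", before["new_customers"].mean(), "->", after["new_customers"].mean())
  # 2. Are existing customers leaving? Customer attrition, before vs. after.
  print("avg lost customers:", before["lost_customers"].mean(), "->", after["lost_customers"].mean())
  # 3. Does the sales decline track the product change rather than the campaign?
  print("avg units sold    :", before["units_sold"].mean(), "->", after["units_sold"].mean())

In this invented data, acquisition keeps improving while attrition worsens and unit sales fall after the assumed product change, which mirrors conclusions 1 through 3 above.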

Myopia and Missing the Important Part

Within the scope of their “Number Crunching,” the client’s team did everything right—and, yet, got the wrong answer.
Why?
  • Because they were “Marketing Analysts,” they focused only on Marketing and marketing data.
  • They did not perceive the business as a “whole system,” and they did not consider possible non-marketing causes for the decline in sales.
  • On the engagement’s second or third day, I asked about such factors as: changes in competition; changes in consumer preferences; changes in supply chain; changes in distribution chain; changes in retailer practices; changes in manufacturing; changes in procurement; and the like.
  • As I recall, except for considering changes in consumer preferences, the Marketing Analysts did not evaluate any of the other possible causes.
  • Also, because the initiative to reduce product cost by ~30-50% was instituted nearly a year earlier, the Marketing Analysts did not pause to consider the possibility of a relationship between the (long-forgotten) initiative and current results. (Overlooking the latency between introducing a change and observing results is a far-too-common failure that is the subject of a future [Digest].)
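Because this latency trap is so common, here is a small, hypothetical Python sketch of one way to test for it: shift a quality proxy forward by a range of lags and see where it best aligns with the sales series. Both series, and the choice of a monthly product-return rate as the quality proxy, are assumptions made purely for illustration.

  # Hypothetical illustration: both series are invented. The idea is to test whether
  # a change made months ago (a rising product-return rate standing in for quality)
  # lines up with sales only after a delay that a same-month comparison would miss.
  import numpy as np

  return_rate = np.array([1.0, 1.0, 1.1, 4.0, 4.2, 4.1, 4.3, 4.2, 4.4, 4.3, 4.2, 4.4])  # % of units returned, monthly
  units_sold  = np.array([100, 101, 102, 103, 102, 101, 100, 92, 85, 79, 72, 66])        # units sold, monthly

  for lag in range(6):  # align return_rate[t] with units_sold[t + lag], for lags of 0..5 months
      x = return_rate[:len(return_rate) - lag]
      y = units_sold[lag:]
      r = np.corrcoef(x, y)[0, 1]
      print(f"lag = {lag} months, correlation = {r:+.2f}")

In this invented data the (negative) correlation strengthens noticeably once the quality proxy is shifted forward a few months, which is exactly the relationship a same-month comparison would overlook.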

Putting the “Applied” in [aCASThinking]

  1. Critical Thinking: Skepticism regarding the claim that a new marketing campaign would restore sales. Insistence on accessing raw (source) data. Independent data analysis (Data Jedi).
  2. Analytical Thinking: Adopting an organized, data-centric approach to gaining an understanding of the problem (sales decline) that included identifying historical changes (e.g., the initiative to reduce product cost by ~30-50%) which might have an effect on current performance.
  3. Systems Thinking: Considering the greatest number of possible causes for a sales decline; including: the end-to-end business; the marketplace; the client’s industry; channels-of-distribution; supply-chains; general economic conditions; marketing; packaging; product; pricing; consumer preferences; retailer practices; et cetera.

If you found this [aCAST][Digest] interesting and valuable, perhaps you’d like to forward it to friends and colleagues?

*Why do I publish events from my personal history?
1. Because I was there and (for the most part) know what I did, what I was thinking, and why.
2. More to the point: I am ACUTELY aware of my failings. While I am reluctant to criticize others, I rush to do so for myself, and I hope you find some benefit from my errors.