——— Via Negativa (My Philosophical Notebook) ———


Below is my philosophical notebook. Everything here is a work in progress; I welcome and encourage comments. This is my thinking at its roughest (expect but ignore typos). The “Articles” tab contains articles I’ve written as a “science writer.” They are much less philosophical and represent an earlier period of my thinking.

The Adverse Effects of Stability (Notes from Antifragile)

For organic systems, a certain degree of instability breeds stability. Small natural forest fires prevent bigger ones. Jellyfish, one of the oldest species on the planet, thrive because they benefit from climate fluctuations. We inject children with disease-causing microorganisms (vaccines) to improve their health. Biological longevity, in other words, arises from a small dose of harm.

Problems quickly surface when we attempt to eradicate all potential sources of harm. Consider the soccer mom, who calls the doctor when she detects even the slightest fever. Can you imagine growing up in such a sterile environment and then moving to a sprawling, germ-infested metropolis? Imagine an assiduous co-worker who arrives at 8am precisely every day. How would you react if, one day, he shows up a few minutes late? In finance the longer a bull market lasts the more a crash will come as a surprise. In sum, when we erase instability, we breed artificial stability, which inevitably breaks—sometimes catastrophically.

Meanwhile, business schools are churning out MBAs who recommend management strategies designed to eliminate errors. Innovation, of course, emerges from error—mistakes are a vital source of information. The Silicon Valley mantra “fail early, fail often” embodies this virtue. As Robert Sutton suggests in his gem Weird Ideas That Work, “[Companies should] encourage people to keep generating new ideas… and to avoid reverting to proven ideas and well-honed skills, rewarding success isn’t enough; you have to reward failure as well.”

(These are notes from a section in Taleb’s Antifragile)

75 Things Every Popular Psychology Book Must Include

  1. Wagner Dodge
  2. Elliott/Antonio Damasio
  3. Phineas Gage
  4. IDEO
  5. Iowa Gambling Task
  6. Gary Klein’s firefighters
  7. Any K&T study
  8. Israeli Day Care Study
  9. Marshmallow task
  10. Black Swans, Fat Tails, Dragon Kings
  11. Phil Tetlock
  12. Narratives, Storytelling
  13. LTCM, Tulip Mania, South Sea Company, Mississippi Company
  14. Dunbar #
  15. Haidt’s Elephant/Rider metaphor
  16. Lake Wobegon effect/Illusory superiority
  17. Intrinsic versus extrinsic motivation (via Pink/Amabile)
  18. Growth versus Fixed mindset (Dweck)
  19. Grit/Duckworth
  20. Flow/Csikszentmihalyi
  21. Deborah Gordon’s Ant research
  22. Jane Jacobs & William Whyte
  23. Goldilocks zone
  24. Google 20 percent time
  25. Whole Foods
  26. Granovetter’s Theory of weak ties
  27. Homophily
  28. Forer Effect
  29. Charles Mackay
  30. Milgram’s experiments
  31. Asch’s studies of conformity
  32. Invisible Gorilla/Change blindness/Attention blindness
  33. Opt-In Donor study
  34. Apollo/Dionysus
  35. Zipf’s Law
  36. Small-world problem
  37. Dunning-Kruger Effect
  38. Paradox of Choice/Schwartz
  39. Red Queen effect
  40. Baader-Meinhof Phenomenon
  41. John Bargh’s research
  42. Stroop Test
  43. Cocktail party effect
  44. Joshua Bell
  45. Fundamental Attribution Error
  46. Tall poppy syndrome
  47. Murphy’s Law/Muphry’s Law
  48. Pythagorean Cup
  49. Placebo Buttons
  50. Stale Popcorn study
  51. Edison’s Failure Quote
  52. Picasso’s quote on solitude and staying creative
  53. Introverts/Extroverts/Cain
  54. InnoCentive
  55. Successful P&G product story
  56. Oxytocin/Serotonin/Dopamine/Mirror Neurons/PFC
  57. Broken Window Theory
  58. Arthur Fry
  59. Motivated Crowd Theory
  60. Apple/Zappos/Netflix
  61. Steve Jobs
  62. Confirmation bias, cognitive dissonance, motivated reasoning
  63. Malcolm Gladwell
  64. 10,000 hour rule
  65. Intuition
  66. Reason
  67. The Halo Effect
  68. Adam Grant
  69. Complex Systems/Chaos/Emergence/Complex adaptive systems
  70. Phil Zimbardo
  71. Cognitive diversity/Scott Page
  72. Brian Uzzi and Jarrett Spiro/collaboration
  73. Groupthink
  74. Abilene Paradox
  75. Desirable difficulties/Elizabeth & Robert Bjork

Business Leaders Can Learn From Engineering Failures

Henry Petroski is a professor of civil engineering and history at Duke University with a dark specialty: engineering failures. His first book, To Engineer is Human, dissects the anatomy of several disasters, from the Tacoma Narrows Bridge to the walkways at the Kansas City Hyatt Regency Hotel. If you’re squeamish about flying or driving over bridges, this book won’t make you feel better. But it’s a captivating window into how engineers think, valuable for anyone in business.

Engineers are natural skeptics. They treat each new engineering project as a hypothesis to be disproven. By imagining a structure under every conceivable situation, engineers are forced to think in the negative. How could this building collapse? How could this bridge fail? What could go wrong? Even a structure as rigid as the Brooklyn Bridge should be treated as an accident waiting to happen. That it has stood for over one hundred years is no guarantee that it will stand tomorrow, despite its structural soundness today.

To think like an engineer is to think critically about success and failure. If you’re seeking business wisdom but you’re sick of the business aisle, Petroski’s books are a good place to look. I’ve curated five insights from To Engineer is Human, Success Through Failure, and his latest, To Forgive Design.

1) Success Does Not Imply Soundness

The I-35W Bridge in Minneapolis stood for more than 30 years before it collapsed in 2007, killing 13 people. An investigation discovered that the steel gusset plates, which secure the trusses, were undersized. To make matters worse, two inches of concrete were added over the years, increasing the load weight by 20 percent. Even though engineers examined the gusset plates during routine inspections, nothing was visibly wrong.

Lesson: Assume weaknesses exist, even if you can’t see them. Find them before they hurt you.

2) Be Aware of “Organizational Drift”

In 1855, engineers completed the Niagara Falls Suspension Bridge, the world’s first working railway suspension bridge. The bridge was a success, but it led to an embarrassing failure nearly 80 years later. Suspension bridges became popular, and every new bridge introduced a small, seemingly innocuous change that sacrificed safe engineering practice for slenderer, more stylish decks.

The Tacoma Narrows Bridge, completed in 1940, was especially slim and flexible. At the time, it was the third longest suspension bridge in the world, and it was designed to withstand 100 mph winds. Yet it could not withstand a 42 mph crosswind one November morning. All suspension bridges move in the wind, but the Tacoma Narrows Bridge was too light, and it collapsed four months after it was completed.

When a design works well, it’s natural to adopt it elsewhere, again and again, each time with a change. The small incremental changes add up until, seemingly all of a sudden, they result in disaster.

Lesson: Beware of “Organizational Drift,” the tendency for companies to move away from their original focus too slowly for anyone to notice.

3) Good Engineers Crave Counterexamples

The psychologist Gary Klein talks about the pre-mortem. Before you start a project, you should imagine the following scenario: “It’s a year later, we’ve done the project, and it’s been a massive failure.” As Robert Sutton explained to me in an interview, we imagine a more detailed and accurate future when we’ve considered worst-case scenarios from that future’s perspective.

Lesson: Before you start a project, conduct a pre-mortem. Imagining failure is vital.

4) The Innovator’s Dilemma

In 2009, a thirteen-story apartment building in Shanghai collapsed, nearly intact. Oddly, the design and construction of the building had nothing to do with its demise. After the building was completed, workers excavated a deep hole on one side to make room for an underground parking lot. They trucked the excavated dirt around the building and piled it on the ground, creating a thirty-five foot heap. Heavy rains saturated the dirt, putting unexpected lateral pressure on the building’s foundation, which began to shift. Eventually, this asymmetrical strain caused the building to fall on its side like a domino. One worker was killed.

After a highly visible engineering failure, our knee-jerk reaction is to blame the design or the designer. The disaster in Shanghai had nothing to do with either. “Had its designers known that the piles would be subject to sideways pressure, they would have made them larger and thus more resistant.” Sound business practices fail for the same reason: unforeseeable, external pressure. In a passage that could come from Clayton Christensen’s The Innovator’s Dilemma, Petroski writes, “Even if a building is well designed structurally, it can still succumb to failure through no fault of its own.”

Lesson: Expect unforeseeable external events to undermine sound business plans.

5) Failure is Good (In the Long Run) Because it Reveals Latent Errors

In a lecture at Case Western Reserve, Petroski imagines what would have happened if Titanic had not crashed into an iceberg. Titanic could have safely crossed the Atlantic thousands of times without sinking. If it had, engineers would have concluded that the design was sound, and built an even bigger ship. The latent errors responsible for killing 1,500 people on Titanic—too few lifeboats and thin bulkheads—would have been ignored, resulting in more deaths, not fewer.

Smart engineers know not only which methods work but which methods have failed and why. The same is true of successful entrepreneurs. The best business books I’ve read this year—Ben Horowitz’s The Hard Thing About Hard Things comes to mind—are filled with practices that we should avoid, not just business platitudes (skin in the game) and truisms (have a bias for action) that we’re instructed to follow. While we can’t plan for every possible failure—engineering catastrophes will happen as long as we build, just as bankruptcy is an important part of the economy—we can avoid mistakes we’ve already committed.

Lesson: Acknowledge and value failure so that you can learn from your mistakes. Conduct postmortems, just as the FAA investigates every aviation accident.

(Pitched this to FC, they weren’t interested. Might pitch to Fortune, eventually to 250Words.)

Problem of Induction in Engineering

Reading Henry Petroski’s To Engineer is Human. Excellent passage here:

The past success of an engineering structure confirms the hypothesis of its function only to the same extent that the historical rising of the sun each morning has reassured us of a predictable future…. [For example] the structural soundness of the Brooklyn Bridge only proves to us that it has stood for over one hundred years; that it will be standing tomorrow is a matter of probability, albeit high probability, rather than one of certainty.