

Part 05 of 08

The Ask Dan Test: How to spot tribal knowledge before it breaks AI-built software

March 30, 2026 · 8 min read

AI made software cheap to build. It did not make the rules easier to get right. And it did not cheapen deep business-logic extraction the way it cheapened code. In many companies, the real workflow logic still lives in operators, analysts, compliance leads, and legacy system owners who can explain the workflow in 90 seconds and still cannot hand you a usable spec. That makes tribal knowledge one of the most expensive hidden dependencies in AI-built software.

[Illustration: undocumented expert knowledge concentrated in a few people while software builds around incomplete rules]

Core thesis

Tribal knowledge is not folklore. It is production logic with no address. Coding became broadly accessible; extracting the real logic out of experts did not.

The bottleneck is no longer coding

Most teams still behave as if implementation is the scarcest thing in software delivery. That used to be a reasonable assumption. When engineering time was expensive and code moved slowly, vague logic could survive longer because teams had time to clarify it during build.

That world is over. Code is easier to generate, faster to change, and cheaper to ship. Implementation got democratized. Expert business-logic extraction did not.

This is why many automation failures are misdiagnosed. They look like engineering problems or AI problems. In reality, they are requirements failures: the team built from an incomplete version of how the business actually works.

AI did not remove the requirements bottleneck. It removed the friction that used to expose it early.

'Ask Dan' is a system smell

You can spot tribal knowledge fast. The policy doc says one thing. The workflow tool shows another. The real answer is in a side conversation: 'Ask Dan, he knows when finance overrides that.'

That sentence should set off alarms. It means part of the operating logic still lives outside the system. Maybe orders over $5,000 need finance approval unless the account is a renewal. Maybe Germany follows a different tax path. Maybe a flagged customer skips the default queue. Whatever the rule is, the software will behave differently depending on whether that branch is captured.
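The difference between "ask Dan" and a documented process is that the branches above can be written down and reviewed. A minimal sketch of what that looks like once the hidden rules are made explicit (the thresholds, country code, and queue names here are illustrative, not real business rules):

```python
# Hypothetical routing logic once the "ask Dan" branches are written down.
# Every threshold and branch name below is illustrative.

def route_order(order: dict) -> str:
    """Return the queue an order should land in."""
    # Hidden rule 1: orders over $5,000 need finance approval,
    # unless the account is a renewal (the part only Dan knew).
    if order["amount"] > 5000 and not order["is_renewal"]:
        return "finance_approval"
    # Hidden rule 2: Germany follows a different tax path.
    if order["country"] == "DE":
        return "de_tax_review"
    # Hidden rule 3: a flagged customer skips the default queue.
    if order["customer_flagged"]:
        return "priority_review"
    return "default_queue"
```

Twelve lines of trivial code, but each branch is a rule that previously lived only in someone's head. Software built without any one of them will behave correctly in demos and wrongly in production.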

If the right answer to a workflow question is 'ask Dan,' Dan is part of your production system. If a workflow depends on remembered exceptions, inbox habits, and unwritten thresholds, the company does not have a fully documented process. It has a human dependency.

You pay for it before anyone leaves

Tribal knowledge is usually discussed like a bus-factor problem. The expensive version is quieter. It happens every week.

PMs rebook the same expert because the first notes missed half the edge cases. Engineers stop for clarifications. QA finds the real exception path after build. Operations keeps overriding the system because the documented process was the official one, not the actual one.

None of this shows up cleanly as 'tribal knowledge.' It shows up as reopened tickets, manual corrections, slow launches, and teams that have learned not to trust the first requirements doc.

A simple rule: if a workflow becomes materially less reliable when one specific person is unavailable, it is not documented well enough to automate safely.

Why documentation theater keeps failing

The obvious response is to demand better documentation. That usually fails for a simple reason: the people who know the most important logic are rarely trained to write specs.

Experts do not store workflow knowledge as polished prose. They store it as examples, edge cases, threshold changes, caveats, and 'actually that depends' branches. The missing rule often appears only when someone asks the follow-up question that exposes it.

That is why experts explain better than they document. A blank template is a bad extraction interface for tacit knowledge. What strong PMs, analysts, and consultants do in conversation is expert-grade extraction, and most teams still cannot do that work consistently or repeatably across every workflow that needs it.

AI amplifies lossy translations

Models can implement what they are given with impressive fluency. They are much worse at noticing the branch that nobody surfaced.

If the requirement says 'send unusual cases to manual review,' a model can build that flow. What it cannot reliably infer is what 'unusual' means, who decides it, what threshold triggers it, or which exceptions bypass it.

That is why tribal knowledge gets more dangerous as AI adoption rises. A hidden rule no longer just creates delay. It becomes a fast path to confidently shipping the wrong logic.

The answer is not better notes. It is business-logic extraction

What teams need upstream of build is not better note-taking. They need a way to turn expert explanation into explicit, reviewable logic: conditions, exception paths, unresolved questions, and a trace back to where each rule came from.

A transcript preserves what was said. A usable spec has to preserve the decisions, boundaries, exceptions, and owners the software will actually enforce.

That does not eliminate judgment. In regulated or exception-heavy workflows, teams still need owner review and sign-off. But it does make the real logic visible earlier, while the expert is still in the room and before engineering or AI tools start building from assumptions.

In practice, a vague instruction like 'send unusual cases to manual review' should become something reviewable: which cases count as unusual, who decides, what threshold triggers review, which exceptions bypass it, and which SME statement introduced that rule in the first place.
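One way to picture the output: each extracted rule is a record that carries its condition, owner, exceptions, and source trace together. The field names, IDs, and the quoted SME statement below are all hypothetical, sketched purely to show the shape:

```python
from dataclasses import dataclass, field

# Hypothetical shape for one extracted rule. The point is that the rule
# carries a machine-checkable condition, a named owner, explicit
# exceptions, and a trace back to the statement that introduced it.

@dataclass
class ExtractedRule:
    rule_id: str
    condition: str                       # explicit, not "unusual"
    action: str
    owner: str                           # who decides borderline cases
    exceptions: list[str] = field(default_factory=list)
    source: str = ""                     # where the rule came from

rule = ExtractedRule(
    rule_id="REV-014",
    condition="order.amount > 5000",
    action="route to manual_review queue",
    owner="finance-ops",
    exceptions=["renewal accounts skip review"],
    source="SME interview (illustrative): 'big orders go to finance "
           "unless it's a renewal'",
)
```

Compare that record to the original sentence 'send unusual cases to manual review.' Same rule, but now a reviewer can challenge the threshold, the owner can sign off, and QA can write a test against it.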

That is the gap RuleFoundry is built for. It is not a meeting bot, note taker, or transcript cleanup step. It productizes the structured follow-up and live logic-shaping that strong PMs, analysts, and consultants do manually, then returns reviewable outputs before implementation outruns understanding.

Once that logic becomes explicit and traceable, product gets cleaner requirements, engineering gets better source material, QA gets clearer scenarios, and downstream build tools stop building from guesswork.

The Ask Dan Test

A workflow fails the test when any of the following are true:

  • The real answer to a workflow question is 'ask a specific person,' not a reviewable artifact.
  • Words like 'normally,' 'special case,' or 'manual review' appear without explicit conditions.
  • Different teams describe the same workflow differently.
  • The official process and the real process are not the same.
  • QA or operations routinely discover important exceptions after build starts.
  • One person's absence materially slows decisions, launches, or approvals.
  • Country, customer-type, renewal, compliance, or override branches are not captured in one reviewable artifact.

The bottom line

Tribal knowledge sounds harmless, almost cultural. In practice, it is undocumented production logic. If it stays trapped in a few people's heads, the business pays for it in rework, fragility, and bad automation.

If this sounds familiar, the next useful move is not another documentation push. It is to run one messy workflow through a real extraction process and see what appears: hidden rules, unresolved branches, source-traceable decisions, and the spec package engineering actually needed.

When code gets cheaper, hidden logic gets more dangerous. RuleFoundry is built for that gap: it makes structured, expert-grade extraction more accessible and repeatable, so teams can turn expert explanation into reviewable logic before code outruns understanding.

Bring us one ugly workflow

We will show you the rules, gaps, flows, and source trace that fall out once the logic is actually extracted and made reviewable.