Question Details

No question body available.

Tags

debugging

Answers (6)

Accepted Answer
November 15, 2025 | Score: 11 | Rep: 220,779 | Quality: Expert | Completeness: 30%

What you describe is based on the observation that the more often a process or approach has worked in the past, the more trust we give it for repeated use in the future. For example, older software components that have been used (successfully) by a large user base are more trustworthy than code which has never been used before ("code which I have just written").

This is a well-known psychological principle which is sometimes framed as "trust through reliability" or "reliability heuristic".

These are surely not fixed academic terms, and they are not specific to code or debugging. To be honest, I doubt this psychological principle has a more specific name in software engineering - certainly not one I have ever heard of. But you can google these general terms and find some articles (from the field of psychology) about the topic.

I don't know whether this can really help with the topic of debugging. Like every heuristic, it can fail: even the oldest and most frequently used software components still turn up bugs after years. Still, when I see a failure in a program where I made a change, I would probably look at the newest and least tested areas (my own changes) first, and at the components which have demonstrated their reliability over years last.
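
To make that ordering concrete, here is a minimal sketch (in Python; all component names and numbers are hypothetical, not from the answer) of the "hierarchy of trustworthiness" used purely as an inspection order, looking at the newest and least proven code first:

    from dataclasses import dataclass

    @dataclass
    class Suspect:
        name: str
        days_since_last_change: int   # newer changes earn less trust
        years_in_production: float    # a long successful track record earns more trust

    def inspection_order(suspects):
        """Least trustworthy first: recently changed code with a short track record."""
        return sorted(suspects, key=lambda s: (s.days_since_last_change, s.years_in_production))

    suspects = [
        Suspect("my_change_from_today", 0, 0.0),
        Suspect("team_module_from_last_quarter", 90, 1.5),
        Suspect("standard_library_call", 3650, 20.0),
    ]

    for s in inspection_order(suspects):
        print(s.name)
    # prints: my_change_from_today, team_module_from_last_quarter, standard_library_call

The explicit sort key is only an illustration; in practice the weighting is informal and shaped by experience, which is exactly the heuristic described above.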

November 15, 2025 | Score: 6 | Rep: 12,721 | Quality: High | Completeness: 30%

I really don't think this has a specific name in engineering, and (although I feel as though I recognise the behaviour you are referring to...) I'm not sure you are characterising it properly.

I don't think most engineers have a specific "hierarchy" they run through to find problems.

They engage in a search for a solution, and have a set of heuristics (enriched by their particular skill and experience) which controls that search, but they don't necessarily go through the search in a fixed way (unless the problem itself appears to be a common one for which they already have a routinised solution).

I don't think this is a behaviour unique to engineering. The salesman, the police detective, the chess player, the doctor, and whoever else - they are all at times concerned with searching for solutions to problems, without being regarded as "engineers" (and certainly not as engineers associated with computing machinery).

If you were looking for a term which describes a particular disposition to think in a certain way about things, this is often called a "mindset", "mental set", or "attitude". The behaviour itself (rather than the manner in which it is done) is often called "problem-solving" or "fault-finding" - or, to use computing jargon, "debugging" (an activity commonly carried out with the help of a now-ubiquitous type of tool for finding faults in software, the "debugger").

It's unfortunate, really, that in my experience there isn't a great body of academic work - at least not public, and in English - about how exactly professional computer workers interact with computers and what their jobs consist of, particularly at the cognitive level.

This is probably true of most professions, really, although I'm aware serious attempts have been made over the decades to study how exactly doctors diagnose conditions, and to replace them with "expert systems" and the like (which I gather has met with limited success).

More effort seems to go into trying to routinise the work of computing professionals and attack their perceived bargaining power (i.e. to change a job that is not well understood to begin with), often by applying management methods perceived to have worked in a manufacturing context. That is despite the fact that, up to about 40 years ago, manufacturing was itself awash with craft workers and apprenticeships, and that in the 40 years since, the manufacturing economy and workforce of the Anglosphere has been destroyed (and what remains is typically very sclerotic and expensive) - which suggests those methods are not in fact particularly effective even in manufacturing.

Works from the 1970s - including, famously, Fred Brooks' The Mythical Man-Month - tend to be the high-water mark of study into computer work as it is actually done.

November 14, 2025 | Score: 4 | Rep: 46,830 | Quality: Medium | Completeness: 70%

This appears to be a Philosophical Razor.

In philosophy, a razor is a principle or rule of thumb that allows one to eliminate (shave off) unlikely explanations for a phenomenon, or avoid unnecessary actions.

At its most essential level, this hierarchy of trustworthiness becomes a thought process to optimize your time when troubleshooting a defect. The intent is to avoid looking in areas of your codebase that are least likely to be the root cause, and therefore eliminate unlikely explanations for the bug.

Some other related philosophical concepts:

  • Occam's Razor: "In philosophy, Occam's razor (also spelled Ockham's razor or Ocham's razor; Latin: novacula Occami) is the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements ... This philosophical razor advocates that when presented with competing hypotheses about the same prediction and both hypotheses have equal explanatory power, one should prefer the hypothesis that requires the fewest assumptions" (emphasis mine).

  • Abductive Reasoning: "a form of logical inference that seeks the simplest and most likely conclusion from a set of observations."

I haven't found a software engineering-specific term for this. The thought process a developer would use here generalizes into something closer to the philosophy of reasoning.

November 15, 2025 | Score: 2 | Rep: 59,562 | Quality: Low | Completeness: 20%

Experience

This is not an obtuse non-answer. You are describing work experience, i.e. the ability to direct future actions based on historically observed outcomes.

Whether it's a developer pinpointing the source of a bug, an engineer figuring out why an engine keeps stalling, or a detective following leads, their experience influences what they consider the most likely suspects.

Not everyone has the exact same experience: different people favor different suspects, based on what they have been exposed to most.

In case you want a fancier term for the same concept:

Subject matter expertise

The Subject Matter Expert role that exists at some companies hinges on precisely this kind of work experience. An SME's job is to help their peers home in on the source of an issue faster by making judgment calls about where the problem is most likely to lie.

November 16, 2025 | Score: 2 | Rep: 1,102 | Quality: Low | Completeness: 50%

"Optimistic debugging" is a term I and former colleagues have used. It doesn't specifically name your hierarchy of (un)trustworthiness, but a more general idea that encompasses those suspicions.

This is based on some of Raymond Chen's OldNewThing blog posts. Upon checking my source, I was reminded that the actual phrasing he used is "debugging is an exercise in optimism."

The principle, as I understand it, is that you hypothesize (based on knowledge and experience) how the program got into the bad state, and you optimistically assume that the hypothesis is correct in order to find more evidence closer to the root cause. The linked post demonstrates that.

I'd argue that the "experience" bit includes recognition that the most-recently changed code is likely to contain either the bug itself or evidence that points toward the source. Of course, that's just one factor you'd use in forming a hypothesis.

November 15, 2025 | Score: 1 | Rep: 4,647 | Quality: Low | Completeness: 50%

I am tempted to say "Debugging Skill" or "Debugging Experience".

The reason is a proof by contradiction: consider someone who is highly skilled at debugging, i.e. someone who regularly identifies the root cause of a problem in a very efficient and timely way.

Now assert that this engineer doesn't use / doesn't have the "hierarchy of trustworthiness" that the question asks about: how are they able to achieve that result?

On any non-trivial problem, I think the hierarchy is a required part of efficient debugging (it cannot be separated from it); to say someone is skilled at debugging is to say that they use the hierarchy concept.

TL;DR - I don't need another name for this ability; saying "skilled at debugging" encompasses it. As such, one might conclude there is no commonly accepted answer to the question.


I concede there are other skills necessary to be good at debugging, such as:

  • A logical pattern
  • An ability to draw conclusions
  • Knowledge of (debugging) tooling
  • Experience of what typically goes wrong

However, I don't think this detracts from my main point.