In a recent discussion on The Future of Coding Slack my understanding of abstraction was challenged. As I currently write about categorization, this was a good reason to reflect on what abstraction means to me. The word gets thrown around lightly in technical discussions. We’re prone to assuming that our intuitive understanding of a word is sufficient to wield it as a weapon in debate.
What is abstraction?
My understanding of the process of abstraction is that it is fundamentally about ignoring detail.
To abstract means to generalize — moving from a more specific concept up a taxonomy to a more generic one in the same domain. If you jump between different domains, I would call that translation or transformation instead. Abstraction is a kind of transformation, which requires “removing detail” and “staying within the same domain”.
Consider an apple and a banana. In our English folk taxonomy we classify both as fruit. Fruit is a higher-level concept, ignoring some of the details that make specific fruits different.
We abstract to generalize more specific concepts. We ignore some details that distinguish the specific concepts and lift the commonalities into one higher up the hierarchy. With the now abstracted, simpler concept, we can think and communicate more efficiently. We no longer have to deal with the details removed. We reduce cognitive burden. We simplify.
If we can agree on fruits, we expand our language with additional meaning that allows us to jump between levels of abstraction whenever convenient. We can talk about how eating fruit is considered healthy, without having to list specific fruits. On the other hand, we can still walk into a supermarket and ask for an apple, if we desire to buy one.
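This jump between levels can be sketched in code. The class names and the `healthy` flag below are my own illustration, not something from the discussion; it is a minimal sketch of how a type hierarchy mirrors a folk taxonomy:

```python
# Hypothetical sketch: "Fruit" abstracts over "Apple" and "Banana"
# by ignoring the details that make them different.
class Fruit:
    healthy = True          # a claim made at the abstract level

class Apple(Fruit):
    color = "red"           # a detail the abstraction ignores

class Banana(Fruit):
    color = "yellow"        # a detail the abstraction ignores

def is_healthy(item: Fruit) -> bool:
    # Talks about fruit in general, without listing specific fruits.
    return item.healthy

# We can still drop down a level and ask for a specific apple.
assert is_healthy(Apple()) and is_healthy(Banana())
```

The function `is_healthy` works at the fruit level of abstraction, while `Apple()` lets us descend to the specific concept whenever convenient, which is exactly the convenience described above.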
Confusingly, in a taxonomy of definitions, abstraction is therefore a more specific concept than transformation.
Transformation assumes less about the thing it defines, so it allows more things to be transformations than abstraction allows things to be abstractions. Abstraction is a more specific kind of transformation in that it expects the removal of detail and the ignoring of specifics as additional requirements for something to be an abstraction.
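Read as predicates, this relationship might be sketched as follows. The dictionary keys are purely my own invention for illustration; the point is only that the abstraction predicate implies the transformation predicate, not the other way around:

```python
# Hypothetical predicates: every abstraction is a transformation,
# but transformation assumes less, so it admits more things.
def is_transformation(op: dict) -> bool:
    return op.get("changes_concept", False)

def is_abstraction(op: dict) -> bool:
    return (is_transformation(op)
            and op.get("removes_detail", False)
            and op.get("same_domain", False))

generalize = {"changes_concept": True, "removes_detail": True, "same_domain": True}
translate = {"changes_concept": True, "removes_detail": False, "same_domain": False}

assert is_abstraction(generalize)      # also a transformation, by construction
assert is_transformation(translate) and not is_abstraction(translate)
```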
Not my definition
This is not a definition! I just explained how I understand the concept. I don’t claim to be an authority capable of defining that term. And I don’t need to. I can just look it up. What I wrote above is merely how I interpret and understand the definitions in a dictionary and on Wikipedia.
As programmers, we define things all the time. When writing or speaking human language, however, I find introducing my own definitions redundant and potentially dangerous.
A common ground for understanding
Am I old-school for using a dictionary and Wikipedia to check my understanding? I can’t help it, perhaps because English is not my first language. I use the system-wide Look Up feature in macOS and iOS all the time.1 Just like with APIs, I want to make sure I use the right one.
However, communication is accelerating and there’s no time for that. Our intuitive understanding has to be sufficient, and unconsciously we’re quick to elevate it to being the only possible interpretation.
Let’s just define things the way we want! I mean, everybody is entitled to have their own opinions, right?
Yes, but if we all just make up our own definitions, how can we ever expect to understand each other?
We need some common ground to start from. If we can agree on a definition in a dictionary or on Wikipedia, then we can explain how we understand it. We might interpret the same definition differently. Likely that is because we need to recursively agree on more definitions for more words used in the definition. In programming, you at least get a compiler error if you’re not precise enough and your types don’t match. Human language doesn’t even throw runtime errors. Over time, you mess up all your state, get confused, and might not even notice.
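The compiler analogy can be made concrete with a small sketch. This is a hypothetical example of mine, not from the original discussion; a static checker such as mypy would flag the mismatched call before the program even runs, and at runtime Python at least fails loudly:

```python
# When "meanings" (types) don't match, a program fails loudly.
def juice(fruit: str) -> str:
    return fruit.upper() + " juice"

juice("apple")   # meanings agree: works fine

try:
    juice(42)    # meanings disagree: fails at runtime
except AttributeError:
    pass         # human language, by contrast, fails silently
```

Human conversation has no such safety net: mismatched definitions simply accumulate until the confusion surfaces somewhere else entirely.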
In my experience, most communication issues are rooted in a lack of agreement on how we understand a concept. We simply don’t understand each other. But we think we do, then it gets emotional, then we wind up screaming at each other, and nobody wants to look at a dictionary anyway.
Today, “everybody is entitled to their own opinion” (yes, you are), often gets interpreted as “facts don’t matter; they’re just your opinion”. This way, we are robbing ourselves of the common ground needed for mutual understanding. Facts used to be something we could agree on. They come out of shared observation or a scientific method that most of us used to accept as one of the best ways to create shared understanding.
If we can no longer agree on some things being more reliably valid than others, then we will no longer be able to create the common ground we need to be able to communicate.
With so much communication going on these days, we can’t possibly strive for that level of precision in everyday life. Thankfully, we don’t have to. We can just unfollow, block, or mute what we don’t like. Problem solved.
Well, not really solved.
For mental health reasons alone, make use of all these options generously. Don’t confront everyone. Pick your battles.
What is important to you?
Is there a chance for a fruitful conversation?
And if there isn’t, are you debating in a public forum so that even if your opponent won’t listen, can others learn from the debate?
If we no longer seek debate with people who happen to have different opinions and perspectives than us, and just retreat into our comfortable bubbles, we no longer create shared understanding. We no longer identify starting points we can agree on and have meaningful discussions about how we each come to different conclusions starting from there. If we lose that common ground, we’re losing the foundation our society is built upon.
Oh well… I’m digressing.
TL;DR: Next time you end up in an argument, check if it’s grounded in a misunderstanding. If it is, find some common ground to agree on and then try to understand each other! Worst case scenario: you sharpen your own understanding.
- Steve Krouse: Future of Coding Community
Come and hang out with us in this Slack of enthusiasts interested in the future of coding.
- macOS User Guide: Look up words on Mac
A built-in feature of macOS (and iOS) to look up any word you can select in any app and quickly get a definition for it.
I once learned in a personality test that I am apparently very particular about words and expect myself and others to be much more precise with them than we usually are. I’m learning to deal with it. The plus side is that sometimes I get feedback for being “good with words”, which I find quite strange, as I don’t consider myself a good writer. Anyway, that probably also explains why diving deep into linguistics had such an appeal to me. ↩︎