General By Default
Whenever communicating, I default to being general instead of specific. I find that while this makes my communication less concise than it could otherwise be, it results in the ideas I’m communicating coming across more accurately.
Specific examples should only ever be given for the sake of clarity, as it’s often easier to wrap one’s mind around a specific instance than a generalization. Dealing in generalities and abstractions comes with a number of idealistic and pragmatic benefits, and few downsides if done with care. In this post I go over some of the ways in which I try to be general by default.
I use singular “they” regularly, even when the gender of the referent is known. The way I see it, if the gender of the person being referred to is not relevant, then drawing attention to it with gendered language distracts from the intended point, and might even introduce unwanted bias in some cases. I’ve found lots of resistance to this, from being called out in casual discussions to significant penalties on school assignments. Even so, I think the concept of gender has little importance in the modern day, except when matters of how it’s misused (i.e. gender-based discrimination) come up. To be clear, I am not advocating that we be gender-blind - we must accept and account for the realities of gender-based discrimination. But I am advocating that we only bring gender into a conversation when it is actually relevant.
I find there are numerous gains to this, the most obvious being the reduction of gender-based biases in some cases. That said, there is also a lost opportunity here: instead of using gender-neutral language to avoid biases, one could employ gendered language so as to challenge biases head-on. Using gender-neutral language can reduce heteronormativity by avoiding assumptions about the gender of one’s partners. It also doesn’t presume a gender binary, which is a huge win for non-binary people. Whether you believe the only genders are male and female, or you believe otherwise, the use of singular “they” is compatible with your belief, so it’s less abrasive overall. Writing “they” also takes fewer characters than writing “him/her” or “he/she” - fewer syllables too. Furthermore, the order of the genders in pronoun lists like “him/her” furthers the notion that male is the default, as the male pronoun almost always comes first; singular “they” handily avoids this ordering issue. It’s also better than using masculine pronouns as singular gender-neutral pronouns, as that reinforces the notion that male is the default gender.
A perceived disadvantage of singular “they” is that it takes longer to read and understand text which uses it. However, research shows that clauses with singular “they” are read just as quickly as clauses containing gendered pronouns that match the stereotypical gender of the antecedent, and are read even faster than clauses containing gendered pronouns that go against the stereotypical gender of the antecedent. This is a pragmatic reason to use singular “they”, which is a nice bonus, but I’m more interested in whether singular “they” should be used as a matter of principle. The same research also shows that clauses using singular “they” to refer to a particular individual whose gender is known are read more slowly than if gendered pronouns matching the stereotypical gender of the antecedent had been used. Even so, I support the use of singular “they” in such cases, as it helps to combat gender stereotypes.
Some would argue that to combat the notion of male being the default gender, we should make a point of using feminine pronouns as singular gender-neutral pronouns, and of placing feminine pronouns first in lists of gendered pronouns. Many RPG sourcebooks go this route, and while I understand and agree with most of their arguments and objectives, I still generally prefer singular “they”. The main argument I’ve seen for using feminine pronouns (or a random mix of masculine and feminine pronouns) instead is that singular “they” is not widely accepted. This counterpoint falls flat when one considers that the use of generic singular feminine pronouns is not at all widely accepted either. Instead of trying to make people believe that both male and female are acceptable default genders, we should do away with the notion that a default gender exists - we should be general by default.
The only real issue I see with singular “they” is that it can create ambiguity, since it serves as both a singular and a plural pronoun. It’d be nice if there were a better solution to this issue: a family of words that were specifically gender-neutral singular pronouns and nothing more. Unfortunately English is a natural language, and it’s difficult to make a widely accepted deliberate addition to a natural language. In any case, I don’t find this issue particularly damning, because of my gripes with how pluralization is handled in English.
Another case where I try to be general by default is with pluralization. I’ll use the plural form of quantifiable nouns in most cases, except when I’m specifically referring to a singular entity: there are only two options (singular and plural), and the latter is much more general than the former. Singularity is only applicable when there is exactly one of something, but plurality is applicable in every other case - even when you have 0 of something! I wish English (and other natural languages) had made singularity the special case and plurality the default. Instead we typically have to modify nouns to indicate they’re plural, e.g. by appending an “s”.
As an aside: plurality works much the same way in English, German, Portuguese, Spanish, Danish, and many other languages, all with 2 ways of specifying quantity: singular and plural. There are, however, many natural languages which handle plurality differently. Slavic languages have 3 ways of specifying plurality, Welsh has 4, and Arabic has 6. I’m not familiar with most of these languages, so I can’t say too much about them, but I do know French, and it handles plurals differently despite having only 2 forms like English. In French, any quantity strictly between -2 and 2 is singular, and everything else is plural. I’m not sure if I like this more or less than how English handles it. On one hand it makes the unmodified (i.e. singular) form of quantifiable nouns more general, but it does so by making the singular form less precise, rather than making the unmodified form plural and having a modifier for the singular form.
Similar to the issues surrounding needlessly gendered language mentioned above, I prefer the terms “humans”, “human race”, or “human species” over the term “men” when describing the species as a whole. While the term “men” has historically referred to the species, it can be ambiguous, because it could be referring specifically to the males of the species. While context is usually enough to disambiguate, I still prefer more general terms that avoid the ambiguity altogether.
Another example of how I apply this idea of being general by default is that I never write about “humans” unless I’m writing about something particular to the human species. Instead of “human” I usually use the word “person”, as in addition to being an adequate descriptor for most human beings, it can also apply to alien lifeforms and “artificial” lifeforms.
Generally, when crafting philosophical ideas, one ought to avoid the idea that humans/people are special unless one can also explain why and how they’re special. Without an accompanying case for why humans/people are more than mere regular matter, a philosophical theory shouldn’t treat them as anything more than that. Granted, this runs against our pre-philosophical beliefs, but those often lead us astray. It’s frustrating how anthropocentric most humans tend to be - their arguments rest on unjustified foundations. There are certainly ways in which one can make the case for the specialness of humans, but that case should actually be made rather than assumed, because it tends to be a massive and vague assumption.
As an example, consider a teleporter which consists of a transmitter and receiver. The transmitter scans an object, disintegrates it, then transmits the information it scanned to the receiver. Then the receiver uses the information sent to it to construct a perfect copy of the object that had been disintegrated. “Perfect copy” meaning that there is no way to distinguish between the object that was on the transmitter, and the object that winds up on the receiver, save for the fact that they have a different position, no matter how advanced your sensors are. Many (most?) people would contend that if you put a plain rock or some other simple inert object through the teleporter, you would get the same object at the receiver. Many of those same people would also contend that if you put a person through the teleporter, you would not get the same person at the receiver. I don’t have any inherent issues with this position - I don’t agree with it, but it can be rationally defended - but it must be explicitly defended, because it’s not at all obvious. It is only obvious in any sense if you have a case for why a person is different from non-person matter, and if you have such a case, make it.
So far I’ve only discussed this notion of generality in the context of writing in natural languages, but it applies in much the same way to writing in programming languages. When programming, you’re communicating (to the computer, and to the people who will read the code in the future) in a formal language in order to express some idea accurately, though not necessarily concisely. The reality is that most important code is living code that changes over time, but as it changes it tends to retain the core idea it was originally meant to express. Changes are usually born from the specification changing, or from the implementation changing to better meet the same specification (e.g. performance improvements, bug fixes, etc.).
Since most important code is living code which changes over time, it helps to write it in a way that is amenable to change. Writing code that’s amenable to any change, no matter how strange or drastic, is impossible, but we can write code that is amenable to changes which retain the essence of the original idea the code was meant to express. You want to choose the right abstraction for the task at hand, while keeping possible higher abstractions (i.e. generalizations) in the back of your mind so that they may subtly guide your design. I’m not suggesting you write code that’s overly abstracted or complex - doing so will actually make the code less amenable to change. Instead, whenever there’s a choice of how to implement something, and the options are all roughly the same complexity, preference should be given to the option that best captures the essence of the idea the code is meant to express.
For example, consider the following Python code:
```python
def do_something(x):
    if type(x) is dict:
        do_something_with_dict(x)
    ...
```
In this little toy example we can imagine calling `do_something` with a `dict` and it would work as intended, but what if we called it with a `defaultdict` instead? Except in very particular situations, we would want the behaviour to be the same as when it’s called with a `dict`. To achieve this we could do the following:
```python
from collections import defaultdict

def do_something(x):
    if type(x) is dict or type(x) is defaultdict:
        do_something_with_dict(x)
    ...
```
But this runs into the same problem if we call it with other `dict` subclasses, such as `OrderedDict` or `Counter`. Even if we checked against all of the `dict` subclasses in the standard library, our code wouldn’t work for user-defined `dict` subclasses, or for any `dict` subclasses added to the standard library in the future. Therefore we should use `isinstance` instead, which checks whether the first argument is an instance of the second argument or of any of its subclasses (n.b. a class counts as a subclass of itself):
def do_something(x): if isinstance(x, dict): do_something_with_dict(x) ...
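As a quick sanity check (a sketch using standard-library classes, not from the original post), the `isinstance` version accepts `dict` subclasses that the `type` check rejects:

```python
from collections import OrderedDict, defaultdict

x = defaultdict(int)
print(type(x) is dict)      # False - defaultdict is a subclass, not dict itself
print(isinstance(x, dict))  # True - isinstance accepts subclasses

y = OrderedDict(a=1)
print(type(y) is dict)      # False
print(isinstance(y, dict))  # True
```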
This helps address the problems identified, but what if `x` were a dictionary-like object that was not an instance of a subclass of `dict`, such as `UserDict` (which, despite its name, does not inherit from `dict`)? If we still want to treat it like a `dict`, then we’ll need to use the following code instead:
```python
from collections import abc

def do_something(x):
    if isinstance(x, abc.Mapping):
        do_something_with_dict(x)
    ...
```
The classes provided in the `collections.abc` module (such as `Mapping`) make use of `__subclasscheck__` to perform custom handling for `issubclass` and `isinstance`. Instead of having to be `dict` or one of its subclasses, anything which inherits from (or is registered as a virtual subclass of) `abc.Mapping` now passes the check. This covers `dict` and all of its (proper) subclasses, and also includes `dict`-like classes which do not inherit from it, such as `UserDict`. It is the most general approach, and typically the most correct one.
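To illustrate, here’s a hypothetical `dict`-like class (the name and implementation are invented for this sketch) that fails an `isinstance(x, dict)` check but passes the more general `abc.Mapping` one:

```python
from collections import abc

class ReadOnlyConfig(abc.Mapping):
    """A hypothetical dict-like class that does not inherit from dict.

    Subclassing abc.Mapping only requires implementing __getitem__,
    __len__, and __iter__; the ABC provides get, keys, items,
    __contains__, and the rest for free.
    """
    def __init__(self, data):
        self._data = dict(data)

    def __getitem__(self, key):
        return self._data[key]

    def __len__(self):
        return len(self._data)

    def __iter__(self):
        return iter(self._data)

cfg = ReadOnlyConfig({"debug": True})
print(isinstance(cfg, dict))         # False - rejected by the narrower check
print(isinstance(cfg, abc.Mapping))  # True - accepted by the general check
print(isinstance({}, abc.Mapping))   # True - plain dicts still match
```

A `do_something` written against `abc.Mapping` handles `cfg` and plain `dict`s alike, which is exactly the kind of generality being argued for here.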
You may be thinking that this is all well and good, but ultimately doesn’t matter much. Certainly the ideas expressed here aren’t terribly important in terms of material impact on people, or in terms of how much they influence the day to day lives of most people, but I contend that they’re still important in the grand scheme of things, albeit in a subtle way.
The structure of our languages influences our thoughts. That is, essentially, the weak form of linguistic relativity, which is supported by empirical evidence. Nowhere have I seen this idea better expressed than in Ken Iverson’s Turing Award lecture, “Notation as a Tool of Thought”. Iverson invented the programming language APL, which ultimately led to the creation of the language q, which I’ve used extensively in my career with KX. I can attest that changing the structure of a programming language can have a great deal of influence over how we think, and I see no reason why the same phenomenon would not carry over to natural languages. See also Paul Graham’s post about the difference in the expressive power of languages.
I believe that the kind of thinking I’ve outlined here can subtly influence how you think (in good ways), as well as how those around you think (in good ways). You might think I’m using the word “subtle” here as a weasel word, since if it’s subtle enough then it becomes impractical to provide data to prove it, and I just so happen to lack any solid data to back up my claims of subtle influence. I could provide anecdotes, but that’s practically worthless when the request is for solid data. To this I’d say that despite the lack of proof it at least seems reasonable that my claims are true, and there is essentially zero cost to acting on the assumption that they’re true. Considering the potential benefits along with the lack of cost, it seems like these claims can be held as a justified belief.
This kind of general by default thinking also has the benefit of being progressive in most cases. Employing specifics unnecessarily in your writing introduces the possibility of it becoming dated over time as certain words and expressions fall out of favour. By remaining general many of these pitfalls can be at least partially avoided even though we don’t know what words and expressions will fall out of favour.
Maybe this habit is something I’ve picked up as a result of being a programmer. As explained, code is a form of communication; specifically, it’s a form of communication wherein accuracy is prized over concision. Barring hardware failure (or other low-level failures), a computer will do exactly what it’s been programmed to do - that is, exactly what you told it to do.