OOP and "Modeling the Real World"

Updated: 4/16/2002


Note: This article was originally part of OOP Criticism, but has since been moved to its own page due to the growth of the original article.

It is often claimed by OOP proponents that OOP "better models the real world", and therefore produces more natural, easy-to-use applications. This is nonsense. Even Bertrand Meyer agrees that OO does not reflect the real world any more than its competitors do. (To be more precise, he questions the wisdom of trying to model the real world at all.)

The term "object" implies that OOP closely matches things in the real world such as cars, customers, invoices, etc. It is true that OOP tends to group things around "nouns", while procedural tends to group things around verbs, or "tasks" to be more precise. However, the "noun" component is often in the database in well-designed procedural systems. In other words, descriptions of the nouns in a procedural/relational system tend to be in databases, and not in code.

In p/r, the associations among the nouns also tend to be managed via the database and relational algebra, and not in the code. This difference shows up, for example, in GOF-like patterns, where the OOP versions use code structures, while the p/r versions tend to use table relations. One is more likely to study the table schemas, ER diagrams, and relational queries to ascertain how a p/r system models the domain, while in OOP one is more likely to use the program code structure itself as the modeling mechanism. They both may model the same things, more or less, just in different places and in different ways.
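
As a rough sketch of that difference (the class and table names are hypothetical, not from any particular system), the same discount rule can live either in the code structure or in data; a Python dict stands in for the table here:

  # OOP-style: the variation lives in the code structure (subclasses).
  class Customer:
      def discount(self):
          return 0.0

  class PreferredCustomer(Customer):
      def discount(self):
          return 0.10

  # p/r-style: the variation lives in data; this dict stands in for a
  # "customer_types" table with a discount rate column, normally queried via SQL.
  customer_types = {"regular": 0.0, "preferred": 0.10}

  def discount_for(cust_type):
      return customer_types[cust_type]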

I personally find the relational-centric approach more change-friendly and overall superior, but this section is about fitting the real world rather than making code more change-friendly. These two goals may or may not be related. Whether better real-world modeling equates to being more change-friendly makes for interesting philosophical discussions.

Whether OOP is "noun-centric" or not is somewhat debatable. Some OO fans generally agree with that characterization, while others reject it. Thus, I am using my working definition here.

One of the biggest differences is not the mere presence or absence of nouns or verbs, but the associations between nouns and verbs. In the real world, the associations between behaviors and physical objects are fairly weak and highly dynamic.

In the real world, most "objects" can be acted upon by a wide variety of behaviors (from other objects). This includes earthquakes, floods, people, pets, vacuum cleaners, wind, flying golf balls, cars, new government regulations, changes in marketing strategy, etc. Real-world objects do not have a list of predetermined "proper" behaviors associated with them, which is essentially what OOP gives them.

OOP was born in the domain of physical modeling, where one strives to make nouns "self-handling", more or less. Whether "modern OOP" is supposed to closely follow this view is hotly debated among OOP proponents.

If the real world were like OOP, then we would get scenarios like, "I am sorry Mr. Mugger, but I have no victim.mug() method. You will just have to try somebody else." The mugger could instead call a Mugger.Mug(victim) method, but the relationship between the mugger and the victim still has to be built or explicitly referenced in advance under one of the two. (The fact that it can go under either one is a bit unsettling. This suggests Continuity Problems if one needs to shuffle between them.)
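
A rough sketch of the two placements just mentioned (the class and attribute names are hypothetical):

  class Victim:
      def __init__(self):
          self.wallet = "cash"
      def be_mugged(self, mugger):      # behavior attached to the victim
          self.wallet = None

  class Mugger:
      def mug(self, victim):            # or attached to the mugger instead
          victim.wallet = None

  # Either way, the association is hard-wired into one class (or split across
  # both) ahead of time; a new kind of interaction means editing or subclassing.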

Unlike the real world, OOP places strict limits on what can do what to what. (Inheritance allows larger sets of behaviors to be included on an object's "behavior list", but there is still a limit to what fits a tree pattern, and inheritance usually cannot be done ad hoc in a practical way.) OOP tends to be pickier about behavior association than other paradigms, which makes it even further removed from the real world. (My favorite approach is to keep associations virtual and local using Boolean and relational algebra, while OOP tends to make them global, which is limiting and distracting in my opinion.)
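
As a rough illustration of keeping such associations virtual (the item attributes below are made up for this sketch), a Boolean expression can select which things a given action applies to at the point of need, rather than relying on a fixed per-class behavior list:

  # Hypothetical items; which ones an action applies to is computed
  # from attributes on the fly, not fixed by class membership.
  items = [
      {"name": "car",    "weight": 1500, "fragile": False},
      {"name": "vase",   "weight": 2,    "fragile": True},
      {"name": "laptop", "weight": 3,    "fragile": False},
  ]

  def can_hand_carry(item):
      return item["weight"] < 10 and not item["fragile"]

  print([i["name"] for i in items if can_hand_carry(i)])   # ['laptop']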

   x = a + b
Even a simple math equation exposes the artificial or forced association of behavior to one-and-only-one noun (one class per method). The plus operation (+) must "belong" to either a or b in most OOP languages. I don't know about you, but I view "plus" as a relatively free-floating operator that takes two operands. Similarly, "mug" (above) is an operator with two operands, the mugger and the victim. (This may suggest that infix notation is superior to prefix; however, that is a separate topic.)
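
In Python, for instance, "a + b" is roughly sugar for a method call on the left operand, so "+" ends up owned by one class (a minimal sketch; the Money class is made up for illustration):

  class Money:
      def __init__(self, amount):
          self.amount = amount
      def __add__(self, other):      # "+" belongs to the left operand's class
          return Money(self.amount + other.amount)

  x = Money(2) + Money(3)            # effectively Money.__add__(Money(2), Money(3))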

Finding a single association is even messier under:

  x = a + b + c + d
(More on verbs and nouns will be presented later.)

It must be noted that people do perceive the world differently, as described here. I agree that OOP may model some people's heads well. Just not mine. I can only make a judgement based on how I perceive the world and language.

Some argue that computer applications tend to deal with a relatively narrow subset of behaviors per object, and thus the strict behavior association is not a hindrance to building good OOP applications, and improves compile-time checking. This may or may not be true, but it still follows that OOP does not better model the real world than other paradigms. If anything, it is a worse model of the real world in this regard. The real world does not have compile-time checking for the most part. (Perhaps if we could have compiled Enron early on, they might have had fewer problems.)

Many computer modeling approaches bring about abstractions that are not one-to-one with real-world objects. For example, one is encouraged to reference an address type (street number, city, etc.) to avoid repeating the address structure for different kinds of real-world objects. This may be considered higher abstraction (or higher factoring), but it is not a better fit to the real world, because the real world has a certain amount of redundancy and repetition that may be seen as counterproductive in a computer application.
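
A small sketch of that kind of factoring (the type names are hypothetical): two otherwise unrelated entities reference one shared Address abstraction instead of each repeating the street/city fields the way real-world paperwork tends to:

  from dataclasses import dataclass

  @dataclass
  class Address:          # factored out once...
      street: str
      city: str

  @dataclass
  class Customer:
      name: str
      address: Address    # ...and merely referenced here

  @dataclass
  class Warehouse:
      code: str
      address: Address    # ...and here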

Such factoring is not inherent to OOP. An RDBMS can also have a single "Address" or "Contact" table that is referenced by other entities if desired. I am not saying here whether this practice is bad or good, only that it deviates from the real world. One could even argue that I should not mention it, since OOP does not factor better or more than other paradigms. However, some do claim that OOP factors better.

Finally, inheritance rarely fits the way real-world collections (data) and structures change over time. Real-world changes rarely care about sticking to an artificial inheritance hierarchy in my experience. Unless there is some "tree cop" in the domain to make changes fit a tree pattern, hierarchies are artificial constructs. True, sometimes the customer is already using tree taxonomies, but whether this is the best approach is debatable. Do OO fans claim that OOP better models bad practices? Besides, trees that a customer is using can still be modeled as data tables if needed. I don't know about you, but I never hard-wire product classifications into code.
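
A minimal sketch of keeping such a taxonomy in data rather than in an inheritance tree (the rows and field names are made up); reorganizing the categories then means changing rows, not class definitions:

  # Each dict stands in for a row in a "categories" table; parent_id forms the tree.
  categories = [
      {"id": 1, "name": "Electronics", "parent_id": None},
      {"id": 2, "name": "Audio",       "parent_id": 1},
      {"id": 3, "name": "Headphones",  "parent_id": 2},
  ]

  def children_of(parent_id):
      return [c["name"] for c in categories if c["parent_id"] == parent_id]

  print(children_of(1))    # ['Audio']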

Simulating The Real World

I disagree with the suggestion that business processes should be heavily modeled as real-world interactions. Viewing business applications as "simulations" can be problematic. Simulations are meant to reflect the interactions of the "real world" in order to study it and improve its flow. Business applications are meant to achieve something by the best means possible using computers. These two goals are not necessarily the same, nor do they necessarily result in the same solution. A common example of this disconnect is that if flight were modeled to reflect our actual real-world experiences, then airplanes would have wings that flap.

Mirroring the real world and getting something done as efficiently as possible are different animals. Sometimes they overlap, but often they don't. The strengths and weaknesses of computers are different from those of humans. Thus, achieving the same task as simply and flexibly as possible via computer requires different approaches than achieving it as a human with desks, paper, elevators, etc.

I don't really question OO's value in modeling interactions and behaviors of the real world. However, the best techniques for modeling the real world and for making better software are probably not the same in most cases.

I also notice that some software developers try to mirror the "real world" very closely in order to keep the customer comfortable by keeping alive archaic processes from the manual way of doing things. However, doing this may miss opportunities to improve or streamline the process. I am not saying that initial customer comfort is a bad thing, but perhaps the customer should be made aware of the tradeoff. Just be careful not to offend them by implying that they won't be able to "handle the ideal process". These kinds of things often take delicate diplomatic skills that frankly sometimes exceed my abilities.

Keep in mind, though, that the internal model (what the developer sees) and the external model (the user interface) are generally independent issues. One of them can reflect or simulate the external world without the other having to do the same.

OOP Better Models Sentences?

I never considered it a very important indicator whether a language or paradigm matches verbal (spoken) language well or not. However, to some OOP fans it seems important. Some have claimed that OO's noun-centricity is a better modeling construct and/or a better fit to natural languages.

In English, the verb is actually the central construct. A sentence can have only one central verb or verb group, but there is no "central noun". The object and the subject are generally considered peers. Thus, Verb Is King if we look at English. (At least the verb is not ranked lower.) Most non-trivial sentences have a subject, a verb, and an object. This resembles the operator-with-operands syntax, such as that found above in the "plus" example.

  add(a, b, c, d, e)
  // same as: a + b + c + d + e
  concat(a, b, c, d, e)
  // same as:  a . b . c . d . e
  // ("." is concatenation in this case)
These kinds of operations are more verb-centric than noun-centric. They have one verb (or action) with multiple nouns. The operator is not associated with one and only one operand, as OOP would have it. Under OOP, a + b can potentially give a different answer than b + a if some conversion is taking place, because b may handle the conversion differently than a would.
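
A contrived Python sketch of that asymmetry (the unit classes are made up for illustration): each operand's class applies its own conversion rule, so swapping the operands changes the result:

  class Feet:
      def __init__(self, n): self.n = n
      def __add__(self, other):
          return Feet(self.n + other.n)              # naively adds the raw numbers

  class Meters:
      def __init__(self, n): self.n = n
      def __add__(self, other):
          return Meters(self.n + other.n * 0.3048)   # converts the other side first

  print((Feet(1) + Meters(1)).n)     # 2.0     (Feet's rule wins)
  print((Meters(1) + Feet(1)).n)     # 1.3048  (Meters' rule wins)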

To me, task-centricity reflects how the real world changes. The task remains more or less the same, but the "players" change. The task is more invariant than the participants. Thus, grouping behavior around nouns instead of tasks is more likely to require disruptive changes. See The Noun Shuffle for more on this.


See Also:
Modeling Business
Aspects
Abstraction


OOP Criticism