Below is an attempt to catalog typical patterns or claims used by my opponents to distract attention from the real topics or to intimidate me. Although they are mostly based on debates with the "IWETHEY" crowd, they often apply to other groups as well.
However, my opponents are not representative of all developers and perhaps not even representative of IWE. I tend to step on favorite fads and trends, and this ticks off a lot of people because they want their favorite trend to continue. Do you think that the Bee Gees wanted disco to end?
To be fair, one of the reasons why I bash OOP is that it ruined some of my favorite trends and paradigm progressions. In short, it is a zero-sum game. OOP has its niches, but it has been overextended due to "get modern" marketing campaigns.
Most programmers I know are ambivalent toward OOP, but will use it to fit in. They play the game the same way many women play the fashion game of trying to beat their competition to the latest fashion craze. It is not whether the current fad is practical or logical, it is whether they can "play" it better. Someone once called this "surfing a fad": you don't fight the wave, you ride it. Well, I am fighting it because I don't like bell-bOOttoms.
In actual usage, many die-hard OO fans complain that most OOP shops are still using mostly procedural designs that are jammed into classes. Classes are mapped 1-to-1 to RDBMS tables via set/gets (trading a table handle for an object, which is very similar conceptually), and an OOP GUI framework is used. Beyond these two half-OO attempts, most code even in OOP languages is still very procedural in nature.
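A minimal sketch of the half-OO style just described (in Python, with invented names): a class that mirrors an RDBMS table row through get/set wrappers adds little beyond what a plain record handle already provides.

```python
# Hypothetical "class-per-table" wrapper: each attribute mirrors a
# column, and the get/set methods merely forward values around.
class Employee:
    def __init__(self, emp_id, name, salary):
        self._emp_id = emp_id
        self._name = name
        self._salary = salary

    def get_name(self):
        return self._name

    def set_name(self, name):
        self._name = name

    def get_salary(self):
        return self._salary

    def set_salary(self, salary):
        self._salary = salary


# The procedural-relational equivalent: just a row (record) handle.
employee_row = {"emp_id": 1, "name": "Alice", "salary": 50000}

emp = Employee(1, "Alice", 50000)
emp.set_salary(55000)
employee_row["salary"] = 55000  # same operation, minus the class ceremony
assert emp.get_salary() == employee_row["salary"]
```

The point is not that the class version is wrong, but that it is procedural code in OO clothing: the "object" is conceptually interchangeable with a table handle.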
Thus the world is not turning to OO from procedural, but is simply doing mostly procedural programming in OOP languages and API's. The languages and API's are changing, but not the actual techniques that programmers use.
There is even some suggestion in the OOP community that even new programmers "slide back into procedural thinking". It just may be that procedural is the default mode of most programmers, even new ones. (However, there are no decent surveys to confirm this so far.)
Thus, OO fans will find aspects of OO lore, often definitions, that a skeptic does not know, and intimidate them into silence. "How can you criticize OO if you don't know what the Visitor Pattern is?" Also see "Trivia Traps" below. The green debater cannot know whether the cited term is important without running off and studying the topic mentioned. (It is a great way to get somebody off your back, but it only works about 3 times before you are discredited as a Red Herring launcher.)
The bottom line is that it is a base philosophical difference and that the trivia is not the key to many paradigm preferences.
Also, one can often get by just fine without fully buying into OO (see above). Thus, there is no benefit in admitting a dislike.
For example, I have presented examples of OO code that used at least one of polymorphism, encapsulation, or inheritance. Yet Jay would claim that my example was "not OOP." I asked him to define clearer criteria, but he declined. Somebody else claimed that "composition" was a central concept of OOP. Yet, I have never seen an introduction to OOP that mentioned composition. If it were so important, then it should at least be mentioned in such texts. It would be like a book called "Introduction to Christianity" that did not mention the Crucifixion. Frankly, composition is just as doable in procedural-relational coding, and is often handled automatically by the database engine instead of being "hand-indexed" by the programmer.
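To illustrate that last claim (a hedged sketch with invented names): OOP "composition" is one object holding a hand-wired reference to another, while the relational equivalent expresses the same relationship as a foreign key that the database engine resolves at query time via a join.

```python
import sqlite3

# OOP-style composition: an Order object "has" a Customer object.
class Customer:
    def __init__(self, name):
        self.name = name

class Order:
    def __init__(self, order_id, customer):
        self.order_id = order_id
        self.customer = customer  # hand-wired object reference

order = Order(101, Customer("Acme Co"))
assert order.customer.name == "Acme Co"

# Relational equivalent: the relationship is a foreign key, and the
# engine performs the "composition" at query time via a join.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id));
    INSERT INTO customers VALUES (1, 'Acme Co');
    INSERT INTO orders VALUES (101, 1);
""")
(name,) = db.execute("""
    SELECT c.name FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE o.id = 101
""").fetchone()
assert name == "Acme Co"
```

Both versions answer "which customer does order 101 belong to?"; in the relational version the programmer declares the relationship and lets the engine do the traversal.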
They mutilate definitions and prerequisites in order to falsely claim that I don't have a general understanding of something. They also mistake language-specific trivia for "understanding".
Plus, everybody and their dog have a different version of what "key OO" is really about. Lack of consistency is not a bragging point, guys.
I have posted thousands of messages to forums and message boards of various types. Because I am human, sometimes I make a silly mistake. This would not be a problem, except that my opponents tend to have wonderful memories for these mistakes (and often little else).
Whenever they start to run out of steam under the current topic, they slip in a reference to a past mistake in an effort to paint me as always faulty. Because the reader often cannot see all the planes that land safely, they assume that my jets are more faulty than average.
Complex languages and concepts are rarely all good or all bad. I like to point out both the good and the bad. That is how things are improved on. Just because I criticize some aspect of something, does not mean that I am dismissing the entire thing.
An example of this was during a perl debate in which I claimed perl was too symbol-happy and abusable in the hands of bored programmers. A side issue popped up in which I made a mistake about local versus global variables. The mistake had nothing to do with the primary issue of symbols and abusability, yet perl-lovers kept shoving my mistake in my face, saying, "You don't know anything about perl, so how can you talk?" They would not address the symbol and abusability issue at face value, instead attacking me personally.
The simplest form of Trivia Trapping involves insults on my spelling or grammar. My favorite is "sloppy grammar is a sign of a sloppy mind." I would like to see the psychological correlation studies for such fortune cookie insults.
When they are unable to produce such examples, here are their typical excuses:
"You would not understand them even if I did make one."
Perhaps I might ask a few questions about it, but so what? Just answer the questions and get on with it. I don't know every command of every language and never claimed that I did. Perhaps we can set a question quota for the contest. However, the number of questions in the past has not been that great.
"You will claim it is outside of the target niche."
The target niche is small to medium custom business applications. If somebody introduces a pattern that they claim is common in business, then they, or some other OO business programmer should be able to describe the situations where it occurred. (After all, the pattern has to be somewhat common to qualify. Something rare does not usually justify added language constructs just to handle such blue moons.)
Example: "I encountered the driver pattern in payroll for automatic deposits. We had to create a standard protocol by which our deposit equipment suppliers would write drivers for this (our) protocol in the same language, and we would plug them all in. I also encountered it in inventory when each supplier had to supply us with a special bar-code reader gun. They built their guns around our specified protocol." (This example is purely fictitious, BTW.)
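The kind of scenario asked for above can be sketched concretely (Python, with invented names, mirroring the fictitious payroll example): the shop publishes a small protocol, and each supplier ships a driver conforming to it.

```python
from abc import ABC, abstractmethod

# Hypothetical protocol published by the shop; each equipment supplier
# writes a driver against it, and the payroll system plugs them all in.
class DepositDriver(ABC):
    @abstractmethod
    def transmit(self, account: str, amount_cents: int) -> str:
        """Send one deposit; return a confirmation code."""

class VendorADriver(DepositDriver):
    def transmit(self, account, amount_cents):
        # A real driver would talk to vendor A's equipment here.
        return f"A-{account}-{amount_cents}"

class VendorBDriver(DepositDriver):
    def transmit(self, account, amount_cents):
        return f"B:{account}:{amount_cents}"

def run_deposits(driver: DepositDriver, deposits):
    # The calling code is identical regardless of which supplier's
    # driver is plugged in -- that sameness is the point of the pattern.
    return [driver.transmit(acct, cents) for acct, cents in deposits]

batch = [("12345", 150000), ("67890", 98750)]
assert run_deposits(VendorADriver(), batch)[0] == "A-12345-150000"
assert run_deposits(VendorBDriver(), batch)[1] == "B:67890:98750"
```

This is the level of detail I am asking for: a named pattern, the business situation that produced it, and enough of the shape to judge how common it really is.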
So, stop making excuses and produce examples and scenarios or admit OO's failure for the stated niche.
Note: it appears that many OO fans tend not to work in the given niche. This may indicate something.
The second variation of Missing References usually follows the "you always x" pattern. This is where a blanket statement is made without specific references. Sometimes even when references are given, the Plane Crash tactic is used. This will give the claims enough credibility in my opponents' minds for them to accept claims without a fuller statistical account.
For example, someone might claim that "your house always smells". Another person may then provide dates and times which happen to correspond to times just after I "took a dump." Whether such times are representative or not of all hours of the day does not matter. Proponents just need enough "fuel" to be able to dismiss my attacks on their sacred cows.
On the other hand, if I answer "get what?", then I appear naive to an uninformed reader. (Malraux rarely provides any real technical content. He is mostly just a heckle-bot that repeats the same Empty Insults over and over again.)
Another hole in elitism relates to a concept I call "population fit." Even if their favorite technology or paradigm only worked well for the mentally elite (only an allegation at this point), it still may not be a good fit for the larger population of non-elites, and perhaps deserves a much smaller niche. When this is pointed out, the debate usually goes like this:
A: The market should focus on making the average programmer better, not the best better.
B: But employers should only hire the best then.
A: That is hard to do. How are they going to know who is the best?
B: If they don't take the time to screen properly, they deserve what they get.
A: But you are thinking on a micro level, not a macro level.
B. So!
It is almost as if they should be rewarded for being (allegedly) superior by having the common paradigms or technologies favor them. This thinking is purely selfish and arrogant. The goal should be to increase the overall productivity and reliability of programming efforts, not to reward a select few who are able to master a niche paradigm or technology.
(Note that in my opinion, paradigms are subjective. What works well for one developer or organizational culture may not work well for another. Thus, it is not necessarily a matter of good programmers versus bad ones, but that some programmers are going to fit current fads better than others. As the fads change, so too will the relative advantages. In other words, today's "elite" might be tomorrow's so-so's.)
Some claim that I have the burden of proof to show that procedural/relational is better than OO. However, this is not really my claim. I may suggest problem spots that I personally find with OO, but this is just bonus information.
Let's look at the possibilities:
1. A is better than B
2. The benefits between A and B are unknown
3. The benefits between A and B are the same
4. B is better than A
Stating that OO proponents have no proof of #1 is NOT the same as claiming #4. Perhaps if there were only two possibilities, they might be right: I would indeed have the burden of proof. However, not when there are four. I am simply saying that there is insufficient information to claim #1 (where A=OO and B=p/r). I make no official claim as to whether the real answer is 2, 3, or 4. (I perhaps should add another possibility: the differences are subjective.)
The main gist of my argument is that OO is over-hyped and over-sold and that it does not deserve all the research and vendor attention until there is proof it is better.
Those that make such a claim rarely even try to present proof/demonstrations. I will often say things like, "show me your best evidence and I will revisit/look at it". They rarely comply. I will count every keystroke and eye movement if necessary to show my point of view (with things like maintenance scenarios), but the other side usually degenerates into authoritarian pre-science: "If you understood OO as well as I do, you would simply FEEL how it is better and we would not need to count keystrokes and eye movements."
Finally, I admit that OO may be better for some niches. I would not admit even that if I were truly rabid. (I try to limit my criticisms to domains I am familiar with.)
Further, my tentative conclusion is that the differences are probably subjective. Given that clear, sound, and inspectable evidence has never been presented, the subjectivity view is a very reasonable tentative assessment of the benefits.
Some OO proponents claim that OO mostly or only shines under big projects. Thus, they suggest that they are unable to demonstrate its benefits without a huge, expensive code review.
However, opinions on this vary widely among OO proponents. See OOSC2 Critique (page 665) for more about this. Further, I think the task-orientation of p/r tends to scale better because tasks stay relatively stable as a project scales. Nouns are more likely to split and change than tasks as a project grows in my experience.
If I am rejecting the OOP hype because I simply like the old ways, then why have I not rejected other newly-popular technologies over the years? Examples include:
More to come . . .