The lack of software leadership is the driving force of verbose ceremonies and placating rituals

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com, or follow me on Twitter.

Good code communicates the essence of its task

Good code does not hide its task behind a slab of boilerplate code whose only reason for existing is to satisfy some complex syntax requirements of the language. Stuart Halloway said this beautifully:

Good code is the opposite of legacy code: it captures and communicates essence, while omitting ceremony (irrelevant detail). Capturing and communicating essence is hard; most of the code I have ever read fails to do this at even a basic level. But some code does a pretty good job with essence. Surprisingly, this decent code still is not very reusable. It can be reused, but only in painfully narrow contexts, and certainly not across a platform switch.

The reason for this is ceremony: code that is unrelated to the task at hand. This code is immediate deadweight, and often vastly outweighs the code that is actually getting work done. Many forms of ceremony come from unnecessary special cases or limitations at the language level, e.g.

factory patterns (Java)

dependency injection (Java)

getters and setters (Java)

annotations (Java)

verbose exception handling (Java)

special syntax for class variables (Ruby)

special syntax for instance variables (Ruby)

special syntax for the first block argument (Ruby)

High-ceremony code damns you twice: it is harder to maintain, and it needs more maintenance. When you are writing high-ceremony code, you are forced to commit to implementation approaches too early in the process, e.g. “Should I call new, go through a factory, or use dependency injection here?” Since you committed early, you are likely to be wrong and have to change your approach later. This is harder than it needs to be, since your code is bloated, and on it goes.
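To make the contrast concrete, here is a rough sketch in Clojure (the name make-report and the sample data are invented for this illustration, not taken from any real codebase): in a functional style, constructing a value is just a function call, so the "new, factory, or injection?" question never has to be answered up front.

;; Hypothetical example: a "report" is just a map, and building one is just a
;; function call. Call sites do not encode a construction strategy, so there is
;; no early commitment to new vs. factory vs. dependency injection.
(defn make-report [title rows]
  {:title title :rows rows})

;; If the construction details change later, only make-report changes;
;; every caller keeps looking exactly like this:
(make-report "Quarterly sales" [{:region "EU" :total 1200}])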

Let’s look at 2 examples of FizzBuzz.

Here is an example in Java:

public class FizzBuzz {
  public static void main(String[] args) {
    for(int i = 1; i <= 100; i++) {
      if (((i % 5) == 0) && ((i % 7) == 0))
        System.out.print("fizzbuzz");
      else if ((i % 5) == 0) System.out.print("fizz");
      else if ((i % 7) == 0) System.out.print("buzz");
      else System.out.print(i);
      System.out.print(" ");
    }
    System.out.println();
  }
}

And here is an example in Clojure:

;; requires the core.match library (https://github.com/clojure/core.match)
(require '[clojure.core.match :refer [match]])

(doseq [n (range 1 101)]
  (println
    (match [(mod n 3) (mod n 5)]
      [0 0] "FizzBuzz"
      [0 _] "Fizz"
      [_ 0] "Buzz"
      :else n)))

Which code is more direct to its task? Which code comes closer to communicating the essence of its task?

The great sadness of Object-Oriented Programming is the waste of brilliant minds

Incredible cleverness has been invested in trying to overcome the weaknesses of Object-Oriented Programming. For instance, Sandi Metz gave a talk at RailsConf 2015 in which she gave an example of using Composition and Dependency Injection to solve a problem. She is aware that her approach involves a lot of bloat, and she even jokes about it in one of her slides:

“More code, Same behavior” is exactly what every developer hopes to avoid. She says, “We love Dependency Injection,” apparently unconcerned by the increase in complexity. She says the end result is “pluggable behavior,” which is certainly a good thing but which we should all hope to achieve in the simplest manner possible. She makes clear that what developers should want is this:

def order(data)
  data.shuffle
end

But, in her example, what they end up with is this:

class RandomOrder
  def order(data)
    data.shuffle
  end
end

These two extra lines of code,

class RandomOrder

end

are the penalty she pays for using Object-Oriented Programming. These two extra lines are pure ceremony. It is always better to have three lines of code instead of five when they do exactly the same thing, so we have to ask: wouldn't our lives be better if we got rid of all the Object-Oriented cruft and simply had functions that could work on our data? Of course, this line of reasoning leads to the conclusion that the Functional paradigm might be a better path forward than the Object-Oriented paradigm. In his talk, "Why Clojure is my favorite Ruby," Steven Deobald refers to this kind of effort as "Navigating disappointment": many brilliant developers engage in tremendous cognitive struggles to work around the weaknesses of Object-Oriented languages.

Consider the overall lesson that Metz wants us to learn:

Is there a simpler way? Can we get "pluggable behavior" with less of a conceptual burden?
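One possible answer, sketched here in Clojure (the function names are invented for illustration and are not from Metz's talk): if "pluggable behavior" means being able to swap one ordering strategy for another, a plain function passed as an argument already gives you that, with no wrapper class and no injection framework.

;; Hypothetical sketch: the ordering strategy is just a function argument.
;; Any one-argument function can be "plugged in" -- no RandomOrder class needed.
(defn process-order [order-fn data]
  (order-fn data))

;; Plugging in different behaviors:
(process-order shuffle [1 2 3 4 5])   ;; random order
(process-order sort    [5 3 1 4 2])   ;; sorted order
(process-order reverse [1 2 3 4 5])   ;; reversed order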

Ceremony cannot replace leadership

In response to "avoid useless ceremony," some developers offer the counter-argument "we need some ceremony to keep the team organized." An example from Hacker News:

I really liked using Sinatra until I used it with a team. The lack of ceremony and structure (as you put it) killed us. The app started out as just a simple service, so why not use Sinatra right? Then things changed and we were more focused on the web side, but stuck with Sinatra because everyone had heard the FUD about rails.

Each person has a preference on where they thought something should be. Days were spent arguing over the location of mundane things. Each team member had a different understanding of REST, so without the rails routing we argued about what we thought an ideal API would look like. We argued about which view engine we should introduce. We argued about how to handle our JS/CS. We argued some more about routes and how to make them more discoverable. On and on. Not intentional, but something new would come up and we would need to figure out where to put it.

This, more than anything, hammered home for me the strength of rails. Rails is great at getting you up and running, but it's greatest strength is that it's idioms are well documented. That gives developers a strong impression of how things are going to be and where things go. This saves you so much time by not having to argue and reach consensus on every little thing.

Does good leadership allow a situation where people feel pressure "to argue and reach consensus on every little thing"? What if the team leader makes all of these decisions? Oh, but wait, that can't happen, because then they would be a dictator, and that would be bad. Better to rely on Rails. Hmm, but wait, who makes the decisions about Rails? David Heinemeier Hansson. Did he engage you in a conversation, and did he feel the need "to argue and reach consensus on every little thing"? No? So he was a dictator? Why is it okay for him to be a dictator, but not okay for the leader of your team to be a dictator?

How to keep a team organized is a completely reasonable concern. But where is the leadership of your team if "Days were spent arguing over the location of mundane things"? If your team lacks leadership, then you might feel the impulse to outsource some of your decision making to an outside body, such as the team that creates Rails. If your internal politics are sick, then using a good framework lets you borrow some healthy politics from someone else. But remember, that is a crutch. You and your project will be much healthier, and more creative, if you can work through the internal issues, or make changes in management, so that you end up with healthier leadership.

The crutch will only help you for a little while. Rails can help you get started, even if you have sick leadership, but after a few months you will find yourself having debates over "How to refactor fat models in Ruby On Rails". And then the leadership issues that you previously ducked will come back to haunt you. You cannot permanently postpone the moment when you need to make hard decisions. And the best option, always, is to confront your leadership issues at the start of the project, rather than 8 months in.

Distrust of developers is the driving force of useless ceremony

In some sense, sick politics, at the national and even the international level, has always been the driving force behind useless ceremony. The direction that Java development took is perhaps the best example of this. There was a political idea that drove Object-Oriented Programming to its peak in the late 1990s: the idea of outsourcing. The idea of outsourcing software development rested on some assumptions about how software development should work, in particular the idea of the “genius” architect, backed by an army of morons who could act as secretaries, taking dictation. Ceremony-heavy Object-Oriented Programming was the software equivalent of a trend that became common in manufacturing during the 1980s: design should stay in the USA while actual production should be sent to a Third World country. With UML diagrams in hand, the design of software could be handled by "visionaries", possessed of epic imaginations, who would specify an Object-Oriented hierarchy that could then be sent to India for a vast team to actually type out; writing code would be reduced to mere grunt work. And the teams in India (or Vietnam, or Romania, etc.) were never trusted; they were assumed to be idiots. So, for a moment, there was a strong market demand for a language that treated programmers like idiots, and the stage was set for the emergence of Java.

Facundoolano, who is a huge fan of Python, sums up the distrust that Java has for programmers:

Java design and libraries consistently make a huge effort making it difficult for a programmer to do bad things. If a feature was a potential means for a dumb programmer to make bad code, they would take away the feature altogether. And what’s worse, at times they even used it in the language implementation but prohibited the programmer to do so. Paul Graham remarkably pointed this out as a potential flaw of Java before actually trying the language:

Like the creators of sitcoms or junk food or package tours, Java’s designers were consciously designing a product for people not as smart as them.

But this approach is an illusion; no matter how hard you try, you can’t always keep bad programmers from writing bad programs. Java is a lousy replacement to the learning of algorithms and good programming practices.

The implications of this way of thinking are not limited to language features. The design decisions of the underlying language have a great effect on the programmer’s idiosyncrasy. It’s common to see Java designs and patterns that make an unnecessary effort in prohibiting potential bad uses of the resulting code, again not trusting the programmer’s judgment, wasting time and putting together very rigid structures.

The bottom line is that by making it hard for stupid programmers to do bad stuff, Java really gets in the way of smart programmers trying to make good programs.

The true path to pluggable behavior comes from avoiding needless specificity

Let's go back to the point that Sandi Metz was making. We want pluggable behavior. We want flexibility. We want code re-use. Is there an easier way to get it than the approach she suggested?

Consider Clojure's alternatives to standard Object-Oriented approaches:

It ends up that classes in most OO programs fall into two distinct categories: those classes that are artifacts of the implementation/programming domain, e.g. String or collection classes, or Clojure’s reference types; and classes that represent application domain information, e.g. Employee, PurchaseOrder etc. It has always been an unfortunate characteristic of using classes for application domain information that it resulted in information being hidden behind class-specific micro-languages, e.g. even the seemingly harmless employee.getName() is a custom interface to data. Putting information in such classes is a problem, much like having every book being written in a different language would be a problem. You can no longer take a generic approach to information processing. This results in an explosion of needless specificity, and a dearth of reuse.
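To make that point concrete, here is a small sketch (the employee data is invented for illustration): when application information lives in plain maps, the same generic functions that work on any Clojure data work on the "domain" data too, and no employee.getName() micro-language ever appears.

;; Hypothetical data: employees as plain maps rather than Employee objects.
(def employees
  [{:name "Ada"   :department :engineering :salary 120000}
   {:name "Grace" :department :engineering :salary 130000}
   {:name "Linus" :department :support     :salary 90000}])

;; Generic data functions, not class-specific getters:
(map :name employees)                                  ;; ("Ada" "Grace" "Linus")
(filter #(= :engineering (:department %)) employees)   ;; just the engineers
(reduce + (map :salary employees))                      ;; total payroll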

There are no perfect computer programming languages. There never will be. Every decision in programming involves making trade-offs between one approach and another. But if any guide holds true in all circumstances, it might be that last point: avoid needless specificity. Avoid all the unnecessary cruft that ties you down to specific implementations. Avoid the needless rules that bind you to specific strategies without actually helping you.

And if this much freedom causes you to wonder "How can my team stay organized?" then you should address those leadership issues and not try to use needless ceremony as a crutch for avoiding a sick internal political situation.

Post external references

  1. http://thinkrelevance.com/blog/2008/04/01/ending-legacy-code-in-our-lifetime
  2. http://examples.oreilly.com/jenut/FizzBuzz.java
  3. https://github.com/clojure/core.match
  4. https://www.youtube.com/watch?v=OMPfEXIlTVE
  5. https://www.youtube.com/watch?v=PCdEbUBk6a0
  6. https://news.ycombinator.com/item?id=4516612
  7. https://www.google.com/search?q=how+to+refactor+fat+models+in+ruby+on+rails&oq=how+to+refactor+fat+models+in+ruby+on+rails&aqs=chrome..69i57.7887j0j7&sourceid=chrome&es_sm=91&ie=UTF-8
  8. https://facundoolano.wordpress.com/2011/11/03/python-doesn%E2%80%99t-treat-me-like-i%E2%80%99m-stupid/
  9. http://clojure.org/reference/datatypes