November 2nd, 2014
(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: firstname.lastname@example.org
I use asserts a great deal in Clojure. They partly take the place of unit tests. Apparently they are mostly unused in the world of Python, although some of the reasons listed here would be general to any language:
Several reasons come to mind…
It is not a primary function
Many programmers (let's not get bogged down by the rationale) disrespect anything which is not a direct participant in the program's ultimate functionality. The assert statement is intended for debugging and testing, and so is a luxury they feel they can ill afford.
The assert statement predates the rise and rise of unit-testing. Whilst the assert statement still has its uses, unit-testing is now widely used for constructing a hostile environment with which to bash the crap out of a subroutine and its system. Under these conditions assert statements start to feel like knives in a gunfight.
Improved industry respect for testing
The assert statement serves best as the last line of defence. It rose to lofty and untouchable heights under the C language, when that language ruled the world, as a great way to implement the new-fangled “defensive programming”; it recognises and traps catastrophic disasters in the moment they teeter on the brink. This was before the value of Testing became widely recognised and respected and disasters were substantially more common.
Today, it is unheard of, for any serious commercial software to be released without some form of testing. Testing is taken seriously and has evolved into a massive field. There are Testing professionals and Quality Assurance departments with big checklists and formal sign-offs. Under these conditions programmers tend not to bother with asserts because they have confidence that their code will be subjected to so much tiresome testing that the odds of wacky brink-of-disaster conditions are so remote as to be negligible. That’s not to say they’re right, but if the blame for lazy programming can be shifted to the QA department, hell why not?
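To ground the discussion, this is the sort of last-line-of-defence check an assert provides in Python. A minimal sketch; the `mean` function is my own illustration, not from the quoted text:

```python
def mean(xs):
    """Average of a non-empty sequence of numbers."""
    # The assert documents and enforces an assumption the code relies on;
    # it is a debugging aid, not a substitute for validating user input.
    assert len(xs) > 0, "mean() requires at least one value"
    return sum(xs) / len(xs)

print(mean([2, 4, 6]))  # 4.0
```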
These reasons are more specific to Python, but it is curious why the “optimized mode” is not used more commonly:
I guess the main reason for assert not being used more often is that nobody uses Python’s “optimized” mode.
Asserts are a great tool to detect programming mistakes, to guard yourself from unexpected situations, but all this error checking comes with a cost. In compiled languages such as C/C++, this does not really matter, since asserts are only enabled in debug mode.
In Python, on the other hand, there is no strict distinction between debug and release mode. The interpreter features an “optimization flag” (-O), but currently this does not actually optimize the byte code, but only removes asserts.
Therefore, most Python users just ignore the -O flag and run their scripts in “normal mode”, which is kind of the debug mode since asserts are enabled and __debug__ is True, but is considered “production ready”.
Maybe it would be wiser to switch the logic, i.e., “optimize” by default and only enable asserts in an explicit debug mode, but I guess this would confuse a lot of users and I doubt we will ever see such a change.
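The behaviour described above is easy to verify by launching a child interpreter with and without `-O` (a small sketch using only the standard library):

```python
import subprocess
import sys

# A tiny program whose assert always fails; under -O the assert is
# stripped from the bytecode, so execution reaches the print statement.
snippet = "assert False, 'boom'\nprint('__debug__ is', __debug__)"

normal = subprocess.run([sys.executable, "-c", snippet],
                        capture_output=True, text=True)
optimized = subprocess.run([sys.executable, "-O", "-c", snippet],
                           capture_output=True, text=True)

print(normal.returncode)         # non-zero: AssertionError in normal mode
print(optimized.stdout.strip())  # "__debug__ is False": assert was removed
```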
This submission describes programming by contract for Python. Eiffel's Design By Contract(tm) is perhaps the most popular use of programming contracts.
Programming contracts extend the language to include invariant expressions for classes and modules, and pre- and post-condition expressions for functions and methods.
These expressions (contracts) are similar to assertions: they must be true or the program is stopped, and run-time checking of the contracts is typically only enabled while debugging. Contracts are higher-level than straight assertions and are typically included in documentation.
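As a rough illustration of how pre- and post-conditions sit on top of plain assertions, here is a hypothetical decorator-based sketch; none of these names come from the PEP:

```python
from functools import wraps

def contract(pre=None, post=None):
    """Attach pre/post-condition checks to a function.

    Like bare asserts, the checks vanish under `python -O`,
    because they are guarded by __debug__.
    """
    def decorate(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if __debug__ and pre is not None:
                assert pre(*args, **kwargs), f"pre-condition failed in {fn.__name__}"
            result = fn(*args, **kwargs)
            if __debug__ and post is not None:
                assert post(result), f"post-condition failed in {fn.__name__}"
            return result
        return wrapper
    return decorate

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def square_root(x):
    return x ** 0.5

print(square_root(9.0))  # 3.0
```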
Python already has assertions, why add extra stuff to the language to support something like contracts? The two best reasons are 1) better, more accurate documentation, and 2) easier testing.
Complex modules and classes never seem to be documented quite right. The documentation provided may be enough to convince a programmer to use a particular module or class over another, but the programmer almost always has to read the source code when the real debugging starts.
Contracts extend the excellent example provided by the doctest module. Documentation is readable by programmers, yet has executable tests embedded in it.
Testing code with contracts is easier too. Comprehensive contracts are equivalent to unit tests. Tests exercise the full range of pre-conditions, and fail if the post-conditions are violated. Theoretically, a correctly specified function can be tested completely randomly.
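The "tested completely randomly" idea can be sketched as: draw inputs that satisfy the pre-condition, run the function, and assert the post-condition on every result. A toy example with a hypothetical `sqrt_impl`:

```python
import math
import random

def pre(x):
    return x >= 0

def post(x, r):
    # The result squared should recover the input, within float tolerance.
    return math.isclose(r * r, x, rel_tol=1e-9, abs_tol=1e-9)

def sqrt_impl(x):
    return x ** 0.5

random.seed(0)
for _ in range(1000):
    x = random.uniform(0.0, 1e6)
    assert pre(x)                    # the generator respects the pre-condition
    assert post(x, sqrt_impl(x)), x  # the contract checks every result
print("1000 random cases passed")
```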
So why add this to the language? Why not have several different implementations, or let programmers implement their own assertions? The answer is the behavior of contracts under inheritance.
Suppose Alice and Bob use different assertions packages. If Alice produces a class library protected by assertions, Bob cannot derive classes from Alice’s library and expect proper checking of post-conditions and invariants. If they both use the same assertions package, then Bob can override Alice’s methods yet still test against Alice’s contract assertions. The natural place to find this assertions system is in the language’s run-time library.
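To see why inheritance is the sticking point, consider a toy sketch in which a shared base class re-checks a class invariant after every public method call, so a subclass override is still held to the parent's contract. All names here are my own invention:

```python
from functools import wraps

class Contracted:
    """Base class: wrap every public method so the class invariant
    (the _invariant hook) is re-checked after each call."""

    def _invariant(self):
        return True

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        for name, attr in list(cls.__dict__.items()):
            if callable(attr) and not name.startswith("_"):
                setattr(cls, name, Contracted._checked(attr))

    @staticmethod
    def _checked(method):
        @wraps(method)
        def wrapper(self, *args, **kwargs):
            result = method(self, *args, **kwargs)
            assert self._invariant(), "class invariant violated"
            return result
        return wrapper

class Account(Contracted):
    def __init__(self):
        self.balance = 0

    def _invariant(self):
        return self.balance >= 0

    def withdraw(self, amount):
        self.balance -= amount

class LoggedAccount(Account):
    # "Bob" overrides "Alice's" method, yet because both classes share
    # the same contract machinery, the invariant is still enforced.
    def withdraw(self, amount):
        self.balance -= amount
```

An overdraft through either class now trips the invariant assertion; had Alice and Bob used two unrelated assertion packages, Bob's override could silently escape Alice's checks.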
This is an insightful phrase: “behavior of contracts under inheritance”. There is some cost to being a multi-paradigm language, and it shows in a situation like this, where two desirable goals are in conflict with each other. Ordinarily, faced with the need for a new control structure, Lisp programmers would say “That is what macros are for.” And, indeed, in a Lisp such as Clojure, there are multiple packages for enforcing contracts, and you can use the packages together without much worry of conflicts. But Clojure does not have to worry about inheritance. It really is the combination of “contracts under inheritance” that creates the need for this functionality to be in the core of the language.
I notice that PEP was introduced in 2003 and has gone nowhere since. I guess this idea is not popular in the Python community? I wonder why. Source