December 3rd, 2014
(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: firstname.lastname@example.org
I made a bug once, and I need to tell you about it. So, in 2001, I wrote a reference library for JSON, in Java, and in it I had this line:
private int index;
That created a variable called “index”, which counted the number of characters in the JSON text that we were parsing, and it was used to produce an error message.
Last year, I got a bug report from somebody. It turns out that they had a JSON text which was several gigabytes in size, and they had a syntax error past two gigabytes, and my JSON library did not properly report where the error was — it was off by two gigabytes, which, that’s kind of a big error, isn’t it? And the reason was, I used an int.
Now, I can justify my choice in doing that. At the time that I did it, two gigabytes was a really big disk drive, and my own use of JSON still involves very small messages. My JSON messages are rarely bigger than a couple of K. And a couple of gigs, yeah, that’s about a thousand times bigger than I need, so I figured I’d be all right. No, it turns out it wasn’t enough.
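The failure mode here is plain integer overflow: a Java int is 32 bits, so it can count no higher than 2,147,483,647 (just under 2 GiB) before wrapping around to a negative number. A minimal sketch of the effect, not taken from the library itself (the names are hypothetical):

```java
public class IndexOverflow {
    public static void main(String[] args) {
        // The true position: one character past what an int can hold.
        long charsParsed = 2_147_483_648L;

        // Storing it in an int wraps around to a large negative value,
        // so any error message built from it is off by about 2 GiB.
        int index = (int) charsParsed;
        System.out.println("int index:  " + index);      // prints -2147483648

        // A long holds the true position with room to spare.
        long safeIndex = charsParsed;
        System.out.println("long index: " + safeIndex);  // prints 2147483648
    }
}
```

Had the field been declared as a long in the first place, the reported position would have stayed correct well past any plausible file size.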
You might think, well, one bug in 12 years, you’re doing pretty good. And I’m saying no, that’s not good enough. I want my programs to be perfect. I don’t want anything to go wrong. And in this case it went wrong simply because *Java gave me a choice that I didn’t need, and I made the wrong choice*.