The New Lines of Code Equation

August 9, 2012

In yesteryear, in the time of our coding forefathers, ancient project managers would gauge a developer's productivity by measuring the lines of code added per unit time. This was a convenient and intuitive metric, as there was seemingly a correlation between the number of lines coded and the number of features added. When companies then started rewarding developers according to this metric, it led to a rash of bad programming habits – most notably copy-and-paste coding. In modern times, we consider ourselves enlightened, and claim to have realized not only that there is no direct correlation, but that attempting to measure developer productivity in this way is both meaningless and destructive. Yet, if you corner the average development manager and ask them to compare developer productivity, many of them would not be able to resist the urge to pull metrics from source control. This is due to a lack of awareness of a complete reversal in the way in which lines of code reflect productivity.

The programming practices of the modern software engineer are amazingly diverse. They include Object Oriented Design, Functional Programming, Design Patterns, System Architecture, Software Frameworks, and Test Driven Development. Each of these practices has the same goal: to reduce the total number of lines of code. Some would argue that, in the short term, several of these practices will temporarily increase the number of lines of code – e.g. a Strategy pattern where a switch might do, or a Domain Model where a God Class would suffice. However, in the long term, each results in a net reduction of lines of code thanks to a judicious application of the DRY principle. Learning the proper application of these diverse, complex, and at times paradoxical practices is a career-long pursuit, and is a skill set typically found only in the most experienced of engineers. To say it another way, the very best engineers are working their hardest to reduce the total number of lines of code.
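To make the switch-versus-Strategy trade-off concrete, here is a minimal sketch in Python using an entirely hypothetical discount-pricing example (the customer types and rates are invented for illustration). The Strategy-style version starts out a few lines longer, but each new policy is one table entry rather than another branch in a growing conditional:

```python
# Switch-style: every new customer type means editing this function,
# and any duplicated branching logic elsewhere must be edited too.
def discount_switch(customer_type, amount):
    if customer_type == "regular":
        return amount
    elif customer_type == "member":
        return amount * 0.9
    elif customer_type == "vip":
        return amount * 0.8
    else:
        raise ValueError(f"unknown customer type: {customer_type}")

# Strategy-style: each pricing policy is a small interchangeable
# function, and the dispatch logic is written exactly once (DRY).
DISCOUNT_STRATEGIES = {
    "regular": lambda amount: amount,
    "member": lambda amount: amount * 0.9,
    "vip": lambda amount: amount * 0.8,
}

def discount_strategy(customer_type, amount):
    return DISCOUNT_STRATEGIES[customer_type](amount)
```

Adding a fourth customer type to the switch version means touching every place the conditional was copied; in the Strategy version it is one new dictionary entry.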

Think of the number of lines of code like straws in the proverbial haystack in which one seeks a needle. The primary measure of developer productivity is their ability to rapidly locate needles in the haystack. The more straws, the harder it is to find the needle. “Finding the needle”, in this context, is what a developer does when they program: they scan through many thousands of lines of code looking for the proper spot to change or enhance in order to incorporate a new business requirement. The secondary measure of developer productivity is their ability to apply the change or enhancement rapidly, with little fear of breaking the system as a whole. Both measures are dramatically impacted by the number of lines of code, with productivity spiking upwards sharply in small code bases and plummeting when the code base becomes massive. This is the #1 reason developers overwhelmingly prefer working on greenfield projects rather than in legacy systems.

What, then, is the proper method for gauging a developer's productivity? Simply put: business value added per unit time. Agile methodologies provide a means to measure this type of productivity in terms of the team's velocity, which is derived from user stories (business value) completed per iteration (unit time). If you cannot gauge productivity from velocity, and instead feel compelled to glance at the number of lines of code, there is a strong possibility that you are doing Agile all wrong.
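The velocity arithmetic is simple enough to sketch in a few lines. The sprint numbers below are invented for illustration; the point is only that the metric divides completed story points (business value) by iterations (unit time), with no reference to lines of code:

```python
# Hypothetical sprint history: story points completed per iteration.
completed_points = [21, 18, 24, 19]

# Velocity: average business value delivered per iteration.
velocity = sum(completed_points) / len(completed_points)
```

A team would typically use a rolling average like this to forecast how many points fit in the next iteration.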


I would like to point out that if we work together today, or have in the past, my opinions may or may not have been influenced by working with you. Most likely they have been, but I have to say that to avoid offending people. You're so vain. I bet you think this site is about you.