Disclaimer: All views are personal and do not reflect any position of the organisations that I am associated with professionally or voluntarily.
For a long time, the technology and software world has been fascinated with Moore's Law. As more and more engineers and graduates made software a way of life, Moore's Law became a commonplace concept.
For the uninitiated, below is a summary of Moore's Law –
Moore’s Law states that the number of transistors on a microchip doubles about every two years, though the cost of computers is halved. In 1965, Gordon E. Moore, the co-founder of Intel, made this observation that became Moore’s Law. Another tenet of Moore’s Law says that the growth of microprocessors is exponential. (Source here)
With the growth in Intel's popularity, Moore's Law also gained significant prominence, popularity and acceptance. The trouble is that this acceptance often amounted to overfitting the concept onto the observation.
The fundamental construct of Moore's Law relies on a time factor. As technology advancement and disruptions continue to take place, that time factor needs periodic revision; the currently accepted figure is 18 months.
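To make the time factor concrete, the doubling rule can be written as a simple projection. This is a minimal sketch with illustrative numbers (not real chip data), assuming the 18-month doubling period mentioned above:

```python
def moore_transistors(initial_count, months_elapsed, doubling_period_months=18):
    """Moore's Law as a function of time: the count doubles once
    every doubling period (assumed here to be 18 months)."""
    return initial_count * 2 ** (months_elapsed / doubling_period_months)

# Starting from an illustrative 1,000 transistors, project 72 months
# ahead: 72 / 18 = 4 doublings, i.e. a 16x increase.
projection = moore_transistors(1_000, 72)
```

Note that the only input driving the projection is elapsed time, which is exactly the property questioned later in this piece.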
A contemporary, equally compelling and thought-provoking concept is Wright's Law.
What is Wright’s Law?
While studying airplane manufacturing, Theodore Paul Wright postulated that 'for every doubling of airplane production, the labor requirement was reduced by 10–15%.' (Source here)
The fundamental premise of Wright’s law is that ‘we learn by doing’.
This makes a lot of sense and has been the foundational premise for many a thriving business, where a productivity target is kept in sight every year. The cost of simply getting the job done is on a constant decline. This is why there is a constant call for automation and for doing more with less.
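Wright's observation can be sketched numerically. The standard formulation expresses unit cost as a power law of cumulative output; the sketch below assumes a 15% reduction per doubling (the upper end of the range quoted above) and illustrative dollar figures:

```python
import math

def wright_unit_cost(first_unit_cost, units_produced, reduction_per_doubling=0.15):
    """Wright's Law as a function of cumulative output:
    cost(n) = cost_of_first_unit * n ** b, where b = log2(1 - reduction),
    so each doubling of cumulative production cuts unit cost by the
    assumed percentage (15% here)."""
    b = math.log2(1 - reduction_per_doubling)
    return first_unit_cost * units_produced ** b

# With an illustrative $100 first unit:
cost_at_1 = wright_unit_cost(100.0, 1)   # first unit: $100
cost_at_2 = wright_unit_cost(100.0, 2)   # one doubling: 15% cheaper
cost_at_4 = wright_unit_cost(100.0, 4)   # two doublings: 15% cheaper again
```

The calendar date never appears: only how much has been produced, which captures the 'we learn by doing' premise.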
The fundamental difference between the two laws is that –
‘Moore’s Law focuses on cost of production as a function of time while Wright’s Law focuses on cost of production as a function of the number of units produced.’
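This distinction matters most when the production rate changes. A short sketch (all parameters are illustrative assumptions, continuing the examples above): if production stalls, a time-based projection keeps predicting cost declines, while a units-based projection does not:

```python
import math

def moore_cost(initial_cost, months, halving_period_months=18):
    """Moore-style projection: cost halves every fixed time period."""
    return initial_cost * 0.5 ** (months / halving_period_months)

def wright_cost(first_unit_cost, cumulative_units, reduction_per_doubling=0.15):
    """Wright-style projection: cost falls only as cumulative output doubles."""
    return first_unit_cost * cumulative_units ** math.log2(1 - reduction_per_doubling)

# Suppose 36 months pass but cumulative production stays at 1 unit.
# The time-based view still predicts a 4x cost drop; the units-based
# view predicts no change, because nothing more was learned by doing.
cost_if_time_drives = moore_cost(100.0, 36)     # 25.0
cost_if_units_drive = wright_cost(100.0, 1)     # 100.0
```

This is one intuition for why an experience-based law can outpredict a purely temporal one.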
IEEE published a detailed study comparing the following –
- Moore’s Law,
- Wright’s Law,
- Goddard’s Law (economies of scale),
- Nordhaus Synthesis (Time and experience) and
- Sinclair, Klepper and Cohen’s Synthesis (Experience and Scale).
The study found that both Moore's Law and Wright's Law are applicable when production grows exponentially, and that, in that regime, Wright's Law gives better predictions than Moore's Law.
With the growing production of AI models and analytics products and solutions, the world of artificial intelligence and data engineering is fast approaching exponential growth. While Moore's Law can no longer explain this growth in temporal terms, applying Wright's Law suggests that the cost of producing these models should fall as cumulative output grows.
The plethora of online courses on AI and ML is helping build general awareness of the concepts. A broader and more abundant talent supply should bring down the cost of talent acquisition (and retention), and with it the cost of production.
This would be Wright's Law at play in this sector.