Should companies invest in AI primarily to cut costs? Counterintuitively, the long-term answer is likely NO, and there is solid reasoning behind it.
When your company implements AI solely to reduce costs, any competitive advantage is temporary. Why? Because competitors who adopt the technology later will get the same capability at a lower price, match your savings with a smaller investment, and erase your head start.
This phenomenon was explored in the 1999 paper “The Effects of Moore’s Law and Slacking on Large Computations”, which demonstrated how technology laggards can ultimately benefit from delayed adoption when technology costs are continuously falling.
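To make the erosion concrete, here is a minimal sketch that assumes the cost of a fixed AI capability halves roughly every 18 months, a Moore's Law-style decay. The halving period and the price are illustrative assumptions, not figures from the paper or from any market data.

```python
# Minimal sketch: how the price of matching a fixed AI capability falls
# over time under an assumed Moore's Law-style cost decay.
# Both constants below are illustrative assumptions, not real data.

HALVING_PERIOD_MONTHS = 18      # assumed interval in which the cost halves
COST_TODAY = 1_000_000          # assumed price of the capability today

def cost_after(delay_months: float) -> float:
    """Price a later adopter pays for the identical capability."""
    return COST_TODAY * 0.5 ** (delay_months / HALVING_PERIOD_MONTHS)

for delay in (0, 12, 24, 36):
    print(f"Adopting after {delay:2d} months costs {cost_after(delay):>9,.0f}")
```

Under these assumptions, a competitor adopting three years later matches the capability for roughly a quarter of the early adopter's outlay, which is why a cost-only advantage rarely lasts.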
Instead of focusing exclusively on cost reduction, companies should leverage AI to improve their products and services. Cost advantages erode as the technology becomes cheaper for everyone, while genuinely better offerings create differentiation that competitors cannot simply buy off the shelf.
The most strategic approach isn’t asking “How can we do the same things cheaper?” but rather “How can we use this technology to deliver something our customers couldn’t get before?”
Note: This perspective intentionally challenges conventional wisdom. While it assumes somewhat idealized market conditions (many competing sellers and high price transparency) and that AI costs keep falling along a Moore's Law-like trajectory, the core insight remains valuable for strategic decision-making.