The Future of AI — part 1

AI is redrawing two old lines: “make or buy” and “software vs hardware.”

Software gets faster to build. Hardware becomes strategic again. Both shifts are happening at once, and both are changing how companies should think about where to invest.

Is AI changing the “make or buy” decision?

AI tools are making software teams roughly 30% more productive across the whole development cycle: design, coding, testing, and maintenance.

That doesn’t mean fewer developers. It means the break-even point for building something in-house has moved. Things that were not worth building before are worth building now. The “buy a SaaS” reflex, which made sense for a decade because custom software was expensive, no longer applies automatically.
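The moved break-even can be made concrete with a back-of-envelope comparison. Every figure below (hourly rate, hours, license cost, the 30% speedup applied to effort) is a hypothetical assumption chosen for illustration, not data from this post:

```python
# Back-of-envelope build-vs-buy break-even.
# All figures are hypothetical assumptions for illustration.

def build_cost(dev_rate, hours, productivity_gain, annual_maintenance, years):
    """Total cost of building in-house over `years`."""
    effective_hours = hours / (1 + productivity_gain)  # AI-tooling speedup
    return dev_rate * effective_hours + annual_maintenance * years

def buy_cost(annual_license, years):
    """Total cost of a SaaS subscription over `years`."""
    return annual_license * years

# Hypothetical internal tool: 800 dev-hours at $120/h, 30% productivity
# gain, $8k/year maintenance, vs a $40k/year SaaS bill.
years = 3
build = build_cost(dev_rate=120, hours=800, productivity_gain=0.30,
                   annual_maintenance=8_000, years=years)
buy = buy_cost(annual_license=40_000, years=years)

print(f"build over {years}y: ${build:,.0f}")  # ≈ $97,846
print(f"buy   over {years}y: ${buy:,.0f}")    # $120,000
```

With these made-up numbers, building wins over three years; drop the productivity gain to zero or halve the license cost and the answer flips, which is exactly the point: the break-even is sensitive to inputs that AI tooling just changed.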

Three implications follow:

  1. Customization is cheaper. Generic SaaS products still solve generic problems, but the competitive advantage of tailored internal tools has gone up, not down.
  2. Integration, not invention, becomes the bottleneck. Anyone can now produce a reasonable first version of something. Making it work inside a real organization — with its data, permissions, and legacy systems — is the hard part.
  3. Vendor lock-in gets more expensive. When it was hard to switch, paying for a SaaS was rational. When it’s cheaper to build or replace, the switching cost baked into some vendor relationships looks a lot more negotiable.

Software vs hardware

For twenty years, “software is eating the world” was the organizing idea. Value moved up the stack. Hardware became commodity plumbing under it.

AI changes that. The models that matter most right now are constrained by what silicon can do, what networks can move, and how much power you can pull through a rack. That means:

  • Chips are strategic again. Not just as investments, but as national capability.
  • Data center physics matters. Cooling, interconnect bandwidth, and power are now directly in the critical path of what products are possible.
  • Edge hardware is back on the roadmap. A lot of AI work will happen close to where the data is — sensors, devices, cars — because moving everything to a central cloud is too slow and too expensive.
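The “too slow and too expensive” claim about centralizing everything can be sketched with rough arithmetic. The camera bitrate, fleet size, and egress price below are all assumptions for illustration:

```python
# Rough edge-vs-cloud data-movement estimate.
# All numbers are hypothetical assumptions for illustration.

CAMERA_MBPS = 4          # one 1080p camera stream (assumed bitrate)
CAMERAS = 200            # assumed fleet size
SECONDS_PER_DAY = 86_400
EGRESS_PER_GB = 0.09     # assumed cloud egress price, $/GB

# Megabits per day -> gigabytes per day (divide by 8 for bytes, 1024 for GB).
daily_gb = CAMERA_MBPS * CAMERAS * SECONDS_PER_DAY / 8 / 1024
daily_cost = daily_gb * EGRESS_PER_GB

print(f"{daily_gb:,.1f} GB/day shipped to a central cloud")
print(f"${daily_cost:,.0f}/day in transfer cost alone")
```

Even this modest made-up fleet produces terabytes per day; filtering or inferencing at the edge and shipping only results is often the only design that survives contact with the bandwidth bill.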

What I’m watching

I’ll use this series to think out loud about a few specific shifts:

  • How engineering teams are actually reorganizing around AI tooling, not just what vendors claim.
  • Which parts of “the cloud” are genuinely commoditized and which are becoming new forms of lock-in.
  • Where hardware decisions that used to be technical have become strategic.

More in part 2.

Tags: digital transformation · hardware · innovation · software engineering · tech trends

Discussion

Comments for this post live off-site so the conversation stays where readers already are.