Not long ago, IT standardization was in vogue as a means of reducing mean time to repair (MTTR) and support costs while improving user productivity. Then along came Bring Your Own Device (BYOD), and standardization as we knew it went out the window in favor of supporting anything and everything walking in the door of the enterprise. This movement has enabled the enterprise to reduce asset costs, but it has driven up complexity and has challenged the very core of IT as it tries to provide essential support services to its installed base of users.
This shift is all part of a new approach to computing that many are calling the third era. How did we get to this point?
The first era revolved around the mainframe computing system, where all processing was centralized and controlled by a group called management information systems, or MIS (the forerunner of today’s IT group). If you needed to crunch numbers and report on business operations, this is where you went. If you needed to create or modify a report, you pleaded with MIS leadership, and several weeks later you might have gotten the report you needed.
The second era was the all-important client-server era, which came about as many people personally bought Apple II or IBM PC machines and brought them into the business to deliver business information in a timely fashion – much to the chagrin of the MIS folks. This era created hundreds of powerful tech companies and ushered in the dawn of personal computing. It later led to the birth of the World Wide Web in the ‘90s and introduced millions to a new form of information gathering – taking client-server architecture further than anyone expected.
The third era of computing is part of the dynamic shift that began in 2010. It’s a functional architecture that defies the structures and control points of the prior era. It’s more open, distributed and virtual than the client-server era could ever have imagined.
The third era is also characterized by a corporate mindset shift away from investing in and owning not only software, but also infrastructure hardware. As we began emerging from the economic recession that started in 2008, business leaders and CIOs questioned the validity and investment costs of modernizing their data centers versus moving some or most of their workloads to the cloud. They found that software and hardware could be rented and consumed as needed, rather than owned. Leveraging cloud technologies, massive amounts of computing resources could be rented for a fraction of the cost of buying and supporting that infrastructure.
When you strip out the revenue from what Gartner calls the Nexus of Forces – Cloud, Mobile, Social and Information – it’s very apparent that the IT industry needed to change. Single-digit growth rates for more than a dozen years don’t excite anyone.
In my next post, I’ll talk about the rise of autonomics and cognitive computing. In the meantime, I’d love to hear your comments, questions and reminiscences from our computing past.