Enterprise technology commentators have been dreaming of a “serverless” business environment for years. A cloud-native setup promises dramatic benefits in developer productivity, scalability, and cost management, all with less lock-in to any one provider or hardware maker.
Until now, however, this idea was more conjecture than reality. Whatever the benefits of hosting some services in the cloud, security, development, and administration concerns led many enterprises to keep core computing and data infrastructure on premises.
Today, the right mix of cloud-native application architecture, a willingness to adapt existing ways of doing business, and “best of both worlds” hybrid solutions is finally paving the way toward a marketplace where unduly costly physical infrastructure is a thing of the past.
Liberating Organizations from Infrastructure Management
No computing, of course, is truly “serverless.” Rather, the term indicates that no server is maintained by the end-user organization, which instead consumes abstracted technology services hosted on a third party's servers.
Practical concerns long led the business world to accept that accessing the benefits of technology meant every enterprise had to own and operate its own mini server business unit. Taken in broader historical context, however, owning and operating complex IT infrastructure was always a costly, cumbersome sideshow for companies in most industries. By moving away from managing a small server operation, enterprises not only do away with an unnecessary challenge, but also free up technology pros to focus on ground-level operational issues rather than server administration.
This arrangement is not only more cost-effective but also enables access to powerful capabilities, such as machine learning clusters, that would have been out of reach for most physical deployments in the first place.
Abstracting to Unprecedented Flexibility
The architecture underlying enterprise service provision is changing, becoming a key enabler of the broader move toward serverless infrastructure. Initial moves toward virtualization centered on abstracting discrete hardware units: “virtual machines” managed via a hypervisor.
Today’s serverless architectures virtualize not only the hardware but also the OS and the runtime itself, scaling out at the function level. This further abstraction reduces the managerial sprawl of first-generation virtualization techniques, allowing individual services to be sourced differently depending on management imperatives around cost, performance, vendor lock-in, and level of administrative control.
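To make the function-level unit of deployment concrete, here is a minimal sketch of a serverless-style handler in Python. The `handler(event, context)` signature mirrors the convention popularized by function-as-a-service platforms such as AWS Lambda, but the event shape and field names here are illustrative assumptions, not any specific provider's API:

```python
import json


def handler(event, context=None):
    """Illustrative function-as-a-service handler.

    In a serverless model the platform, not the application team,
    provisions the server, OS, and runtime; the function sees only
    an event payload and returns a response.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }


# Locally, the "platform" can be simulated by invoking the handler directly.
if __name__ == "__main__":
    print(handler({"name": "enterprise"}))
```

Because the function is the entire deployable unit, the platform can scale it from zero to many instances per incoming event, which is the abstraction the paragraph above describes.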
While a true serverless model enables unmatched simplicity of management, scalability, and agility, these solutions still have relatively limited monitoring, debugging, and ecosystem support. As the model continues to mature, more enterprise services can go serverless.
For now, organizations can choose based on their own strategic priorities. Flexible microservices and demanding batch-processing tasks can go serverless, for instance, even as core operational systems stay on in-house servers and the development environment moves to a container-based approach.
A Practical Path Forward
The precise path forward will shift as technologies continue to mature. One thing is certain, however: the cost savings of cloud-driven innovation are not automatic; they require specialized operational know-how to be implemented efficiently in any specific business context.
Every organization should establish training and consulting arrangements to build adequate knowledge for the transition to IaaS/PaaS/serverless environments. Time and resources invested upfront, ideally including prototyping efforts, will yield cost savings down the road. Developing a true cloud-native team is a strategic decision bound to pay dividends in the long run.
The benefits of serverless enterprise computing are simply too substantial to ignore. The trick is building the knowledge base necessary to pave the way there. While the tools for a serverless environment are still being fleshed out, requiring a pragmatic, hybrid approach in most cases, they are rapidly maturing. We expect serverless use cases to proliferate, and organizations that haven’t prepared the resources needed to keep up will find themselves scrambling.