7 ways the cloud is changing

Computing without servers? Programming without code? Let’s ponder these and other innovations in the constantly evolving cloud.


The word “cloud” is one of those nebulous words that people deploy with steadfast conviction. When politicians get in a bind, they like to say that the solution to a problem is more “education.” Doctors toss off the word “rest.” And for the last decade or more, everyone in the IT shop from the intern to the CIO has known that the right answer to any question is “the cloud.”

The word, though, hides a great deal of complexity and confusion because it has meant so many different things over the years. Before the buzzword was even adopted, mainframe companies sold timeshare systems. The first early colocation companies rented accounts on Unix servers. Companies like America Online offered cloud-like storage and computing services under the guise of entertainment.

This evolution has continued even after the word was crowned. The earliest offerings were just dozens of thin virtual machines running on fat servers. You could pretend to be root even though your instance was a tiny fraction of the machine. Then FTP servers were rebranded as buckets and dropboxes. Since then, services have exploded and the letters “AAS” have become the favourite suffix for acronym creators.

The evolution continues and every new development prompts cloud users to rethink what they want to rent and what they expect to get for their money.

Many of these are rediscovered ideas that have been repackaged and rehyped. Others are clever solutions to the problems created by the last generation of solutions. All of them give us a chance to look at what we’re building and, in the words of the poet, “make it new.”

Here are seven important ways the cloud is innovating and evolving:

1 - Money-saving granularity

The first generation of serverless computing came with relatively coarse units of computation. AWS Lambda, for instance, began by rounding every call up to 100ms. Programmers quickly learned that a fast, simple sliver of computation cost the same as one that ran 20, 30, or even 100 times longer. Whether or not they were careful or ruthlessly efficient, it didn’t matter. The bill was the same.

That lazy freedom is gone. As more and more serverless platforms compete, the size of the smallest grain of computation is getting smaller. AWS just stopped rounding up to 100ms and started billing in 1ms increments. Now cleaning up your code and watching for slow detours will be reflected in a lower bill.
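The effect on the bill is simple arithmetic. Here is a sketch with hypothetical numbers (the per-GB-second rate, memory size, and call volume below are illustrative assumptions, not quoted prices):

```python
import math

def billed_ms(duration_ms: float, granularity_ms: int) -> int:
    """Round a run's duration up to the billing granularity."""
    return math.ceil(duration_ms / granularity_ms) * granularity_ms

GB = 0.125            # a hypothetical 128MB function
RATE = 0.0000166667   # USD per GB-second; illustrative, check current pricing
CALLS = 1_000_000     # hypothetical invocations per month
RUN_MS = 8            # the function's actual run time

def monthly_cost(granularity_ms: int) -> float:
    gb_seconds = CALLS * (billed_ms(RUN_MS, granularity_ms) / 1000) * GB
    return gb_seconds * RATE

old_bill = monthly_cost(100)  # every 8ms call rounded up to 100ms
new_bill = monthly_cost(1)    # billed in 1ms increments
```

With these numbers the 1ms granularity cuts the bill by a factor of 12.5, which is exactly why trimming slow detours out of a short function now pays off directly.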

2 - A diet for the OS

The modern operating system is a wonderful Rube Goldberg machine designed to juggle bits for print jobs, video games, document editing, and a bazillion other tasks. Have you ever wondered why you were booting up your cloud instances with all of the code ready to handle these tasks?

Unikernels are one way to strip away all of that complexity. A unikernel toolchain compiles your application, together with only the operating system services it actually uses, into a single small package that runs directly on the standard hypervisor. Proponents don’t just celebrate the efficiency of leaving out all of those extra libraries; they also point out that the attack surface is much smaller, which makes unikernels easier to secure.

Or why not get your minimal operating system straight from AWS or Google? Google’s Container-Optimised OS and Amazon’s Bottlerocket apply the traditional virtualisation paradigm one level up: containers play the role of the virtual machines, and a minimal, locked-down Linux plays the role of the hypervisor. They shine for wrapping up microservices that do one small thing without leaning on much of the operating system’s functionality.

3 - Open source functions

Another way to simplify the job of deploying to the cloud is to let developers write a simple function and leave all of the other work to the cloud itself.

Over the last few years, all of the major clouds unveiled their own tools that allow a small fragment of code, a single function, to make some decisions and process some data. These were wonderful advances, especially for people knitting together many services into one big product.

The only downside was the vendor lock-in. While the functions could be written in many languages, the interaction with the framework was proprietary. Now there are a number of interesting open source projects—OpenWhisk, OpenFaaS, Kubeless, Knative, Fission—bringing functions-as-a-service to any machine of your choice.
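The functions themselves stay tiny regardless of framework. As a sketch, OpenFaaS’s Python template looks for a module exposing a `handle(req)` function along these lines (the exact template contract varies by version, so treat this as illustrative):

```python
# handler.py: a function in the general shape OpenFaaS's Python template
# expects. The framework handles routing, scaling, and transport.
import json

def handle(req: str) -> str:
    """A tiny stateless function: count the words in the incoming text."""
    try:
        body = json.loads(req)
    except (ValueError, TypeError):
        # Not JSON: treat the raw request body as the text itself
        body = {"text": req or ""}
    if not isinstance(body, dict):
        body = {"text": str(body)}
    text = str(body.get("text", ""))
    return json.dumps({"words": len(text.split()), "echo": text})
```

Because the function is just ordinary Python with no proprietary imports, moving it between FaaS frameworks is mostly a matter of renaming the entry point.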

4 - Arm chips

They’re not just for Mac lovers. Amazon has a line of servers running its own Graviton chips with Arm cores, which Amazon says offer up to 40 per cent better price-performance. Of course there are some caveats. You’ve got to recompile your executables for the Arm platform, unless you’re working in higher-level languages like Java, JavaScript (Node.js), or PHP (Drupal, WordPress, etc.).

Whether you’ll see big savings depends heavily on the nature of your computation and the load. Some benchmarks place the Graviton machines in the same general range as comparable Intel-based machines. Others suggest the Arm-based instances are a bit less capable, making them more suitable for lightly used code that can enjoy the cost savings without running longer.

Should you make the switch? The only way to find out is to test your own workloads in close-to-production environments and see whether they enjoy the savings.
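Interpreted code ports over cleanly, but any native binaries or compiled extensions must match the CPU. A minimal runtime check, which a deployment script might use before fetching architecture-specific artifacts, could look like this (a sketch, not a complete deployment tool):

```python
import platform

def is_arm64() -> bool:
    """True when running on an Arm64 machine such as a Graviton instance."""
    # Linux reports the machine type as "aarch64"; macOS reports "arm64"
    return platform.machine().lower() in {"aarch64", "arm64"}

# A deploy script could branch on this to select the right wheel or
# binary for packages with native extensions.
arch_label = "arm64" if is_arm64() else "x86_64-or-other"
```

The same check is also handy in CI, where you may want to build and benchmark both architectures side by side before committing to the cheaper instances.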

5 - Do-everything databases

Long ago, Fortran programmers watched their language add exciting new feature after feature and quipped that they didn’t know what the programming language of the future would look like, but it would be called “Fortran.” Today, the same might be said of databases, the original micro-function-as-a-service layer. Databases once stored rectangular tables; now they do almost everything.

Developers are starting to notice just how much is under the hood. PostgreSQL 11, for instance, has its own JIT for compiling queries, and its embedded procedures can now commit or roll back transactions. The database speaks JSON, so it’s easier than ever to build a full microservice without leaving the bounds of the database.
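Showing PostgreSQL’s JSON support requires a running server, but SQLite’s JSON1 functions, which ship with most CPython `sqlite3` builds, illustrate the same idea: the query itself reaches inside the document, much like PostgreSQL’s `->` and `->>` operators. A small sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (body TEXT)")
conn.execute(
    "INSERT INTO docs VALUES (?)",
    ('{"user": "ada", "orders": [{"total": 42}]}',),
)

# json_extract digs into the stored document inside the query itself,
# so the application never has to parse the JSON by hand.
user, total = conn.execute(
    "SELECT json_extract(body, '$.user'),"
    "       json_extract(body, '$.orders[0].total') FROM docs"
).fetchone()
```

Once the database can navigate the documents it stores, a surprising amount of a microservice’s logic can move into the queries themselves.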

Other databases like Azure Cosmos DB combine SQL, MongoDB, Cassandra, and graph APIs. Still others like Google’s Firebase offer the opportunity to both store the data and deliver it to clients through replication. They merge distribution with storage. There are dozens of new databases like this bringing new functionality to the basic, utilitarian term “database.”

6 - New roles for office applications

Spreadsheets, those grids full of numbers, letters, and formulae, are the lingua franca of the bean counters and managers, not the coders, but they’re getting more respect as a smart file format and a way to open up the cloud to the masses.

The “no code” movement is cutting the programmers out of the loop and reaching out to the macro jockeys directly with tools that turn spreadsheets into apps. Google, for instance, has been bragging that one company built and deployed more than 35 business apps with “no coding skills” using AppSheet.

It’s not just the spreadsheet. The entire suite of office applications is now home for more and more of the custom applications that run a business. Instead of building and deploying apps to stand-alone instances, some coders are building apps that integrate with the word processors, slide presentation builders, and other generic tools in the Google or Microsoft universes.

When the connections are there, it makes life easier for everyone in the enterprise who spends most of their time juggling documents and email.

7 - Computing at the edge

The cloud continues to evolve by pushing more and more computing power to the edges of the network. Companies like Cloudflare were once dumb caches; now they offer smart computational services. Cloudflare Workers will run JavaScript, Rust, C, or C++ code in one of the company’s 200-plus data centres located as close to users as possible. There are local databases too.

Amazon’s AWS for the Edge service offers a similar opportunity to move your code closer to the users. Those who use the Amazon SageMaker machine learning services can push them out of the major data centres to the edges.

Amazon is now emphasising its connection to the emerging 5G cellular networks, no doubt expecting that mobile devices will change from casual consumers of expensive data into the main portals through which people do most of their internet browsing.

Microsoft’s Azure IoT Edge is targeting the explosion of devices with features that depend upon the cloud. The Custom Vision service, for instance, brings edge computing to all the burgeoning networks of cameras.

There are dozens of examples like this that are being developed as the cloud grows and insinuates itself everywhere. The cloud was once limited to a centralised collection of data colocation buildings with rentable instances, but now it’s moving into the network. After that? The internet of things? The smart oven in the kitchen? The computers in our cars?

If cloud computing weren’t so useful, it would be tempting to deploy science fiction metaphors like the Blob or the Borg. But as long as it’s easier to rent than own, the cloud will be embraced by anyone who needs computers and data storage to carry their enterprise forward.

