There are many cloud companies that do a perfectly good job. You click and they deliver a root login to a running instance.
All of them are competent. Some even have areas where they’re the best. None of them, though, measures up to the breadth and depth of Amazon Web Services (AWS).
The reason is simple: AWS has built out so many products and services that it’s impossible to discuss them all in a single article or even a book. Many of them were amazing innovations when they first appeared, and the hits keep coming.
Every year Amazon adds new tools that make it harder and harder to justify keeping those old boxes pumping out heat and over-stressing the air conditioner in the server room down the hall.
For all of its dominance, though, Amazon has strong competitors. Companies like Microsoft, Google, IBM, Oracle, SAP, Rackspace, Linode, and DigitalOcean know that they must establish a real presence in the cloud, and they are finding clever ways to compete and excel in what is less and less a commodity business.
These rivals offer great products with different and sometimes better approaches. In many cases, they’re running neck and neck with AWS. And if what you’re after is a commodity machine, well, their commodity Linux instance will run the same code as AWS.
Sometimes the competitors not only match AWS for commodity products, but they actually do a better job. These advantages often appear when the competitors link their cloud to parts of the computer ecosystem that they already dominate.
If you want to use .NET code, you’ll find it just a bit easier on Microsoft Azure. If you want to use Google’s G Suite of web-based office productivity tools, it’s no surprise that they’re well-integrated with Google Cloud Platform.
Still, for all of these competitors’ innovation and success, Amazon continues to outshine them in many ways, and the phrase “many ways” is a pretty good summary of Amazon’s approach. The company has evolved a strong, consistent style that might be described as overwhelming.
The AWS cloud offers at least 10 different databases and another nine products lumped into a separate category called “storage.” There are dozens of machine types available in dozens of different configurations of RAM and CPU, and you can arrange for Amazon to scale them automatically when the load increases.
Indeed, the greatest advantage of AWS may be the sheer overwhelming number of options. Most of the time, someone there has faced the same problem that’s confounding you and they’ve set up a team to productise a solution. You just have to work your way through all of the options.
Here are just 14 of the ways AWS beats Microsoft Azure and Google Cloud.
1 - Neutrality
Amazon is great at so many things, but there are a number of important parts of the computing ecosystem where they really don’t have a dog in the fight. In these areas, they’re good about playing well with everyone.
Microsoft Office dominates many business spaces around the world and Google’s G Suite handles most of the others. It’s common to find that the various parts of the AWS cloud like Alexa for Business work well with both worlds. Microsoft’s Visual Studio has many devotees and so does Eclipse. AWS doesn’t play favourites and has an integration toolkit for each of them.
Amazon is not afraid of embracing standards and bringing them inside its big tent. If you run Microsoft SQL Server or Oracle Database, Amazon will provide what you need and won’t force you to rewrite your code to use its own proprietary storage model.
It’s easy to find the products of ostensible competitors running smoothly in the AWS cloud. The company’s breadth and depth is a result of being unafraid of reaching into many corners of the computing world and excellent at bringing those corners into the cloud.
2 - Full-service options
In the early years, AWS was aimed at delivering commodity machines that did everything that was asked of them. Lately Amazon has been adding built-out services that target certain niches—like a customer service call centre, in the case of Amazon Connect.
Maybe you don’t need Amazon Connect, but as AWS adds more of these targeted services, there’s a good chance that someone at AWS already built most of what you need.
They’ve already got plenty of tools for call centres, Internet of Things, mobile app support, business productivity, and, for those with their own network of orbiting satellites, a full-function “Ground Station as a Service.”
3 - Container focus
All of the cloud companies understand the attractiveness of containers like Docker, but Amazon is pushing further by building a special, stripped-down version of Linux called Bottlerocket that has just enough code to keep the machine running but not much more.
Teams running microservices can choose it and quit worrying about extra cruft like FTP servers sitting around in the background. Amazon also intends to embrace containers even more by skipping over the traditional packages for security and feature upgrades.
These will be available as complete containers instead, so upgrades can be done in one step and managed with many of the same tools you’re already using to juggle your own containers.
4 - AWS Lambda
AWS Lambda started as a cute idea, a kind of simple shell script that could glue together all of the operations in the cloud. Users quickly turned to Lambda’s serverless functions to handle occasional computing tasks because it’s so much more efficient than dedicating a machine for work that arrives sporadically.
That might be a background process that runs once an evening, a corner of your microservices architecture that isn’t used often, or maybe just that blog full of your rantings that still hasn’t found its audience yet.
Eventually people realised that Lambda could be ideal for more complicated background processing, especially the kind that would run sporadically. If a server is idling more than it’s working, switching to the serverless model could save a dramatic amount of money.
AWS noticed and expanded the maximum RAM for Lambda functions to 3GB and maximum running time to 15 minutes.
If your processing is sporadic, Lambda could be the cheapest way to host some significant jobs in the cloud. And the use cases will only continue to grow. AWS Lambda functions can already manipulate most parts of the Amazon cloud, and they’re infiltrating what’s left. Amazon probably won’t stop until every part of the AWS cloud answers to serverless commands.
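The programming model itself is refreshingly small: a handler function that receives an event and a context object, does its work, and returns a value. A minimal Python handler for that once-an-evening background job might look like the sketch below (the event fields here are illustrative, not a fixed AWS schema):

```python
import json


def lambda_handler(event, context):
    """Entry point that Lambda invokes; `event` carries the trigger's payload."""
    # Illustrative payload: a batch of records to summarise once an evening.
    records = event.get("records", [])
    total = sum(r.get("amount", 0) for r in records)
    # Whatever the handler returns goes back to the caller or invoking service.
    return {
        "statusCode": 200,
        "body": json.dumps({"count": len(records), "total": total}),
    }
```

Because the handler is an ordinary function, it can be exercised locally before it’s deployed, and AWS only bills for the milliseconds it actually runs.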
5 - Broad AI platform
If your shop is obsessed with mixing AI into your stack, Amazon has almost too many options to list. They begin with basic tools like SageMaker for training models to respond to your data. These tools have attracted plenty of developers, and that may be why the sales literature brags that “85 per cent of TensorFlow projects in the cloud run on AWS.”
The basic training, though, is just the beginning, because AWS also offers a wide range of tools aimed at particular industries. Amazon Comprehend Medical ploughs through unstructured medical texts looking for lifesaving treatments. Amazon Fraud Detector looks for malicious behaviour.
Developers new to AI can begin exploring with highly automated options like AutoGluon, a project from the AWS lab that is designed to make it simpler to dive in.
The Microsoft and Google clouds also have deep AI expertise and commitment, but Amazon’s wide range is hard to beat.
6 - 24 terabytes of RAM
If you’re running very big enterprise databases or you just believe that whoever dies with the most RAM wins, some of the new high-memory machines from Amazon are just what you need. Up to 24 terabytes of RAM can be yours in just a few clicks.
AWS gives you other ways to wear your big boy pants as well. They’ve worked with vendors like SAP to make sure their cloud instances can scale up as big as necessary. Are your Oracle databases getting too big? Amazon wants to host them and it has expanded its Amazon RDS for Oracle machines to support instances with as much as 64 terabytes of SSD storage.
7 - Distributed MySQL or PostgreSQL
To the programmer, Amazon Aurora looks just like either MySQL or PostgreSQL. You choose the syntax and, underneath the covers, Aurora will store the data in a fast, SSD-based, virtualised storage layer. That alone is a clever idea that lets the programmers use their favourite open source version of SQL.
There’s even more magic, though, because Aurora distributes your data over multiple machines in multiple zones. Your data is split between hundreds of storage nodes in three different zones, ensuring reliability and access speed. Aurora does parallel queries across all of the storage nodes to speed up access, sometimes dramatically.
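Because the wire protocol is unchanged, moving an application to Aurora is typically just a matter of pointing the connection string at the cluster endpoint; the driver and the SQL stay the same. A sketch (the host names, credentials, and database name are placeholders):

```python
from urllib.parse import quote


def dsn(user, password, host, db, port=5432):
    """Build a standard PostgreSQL connection URL; the same form works
    for any PostgreSQL-compatible server, Aurora included."""
    return f"postgresql://{quote(user)}:{quote(password)}@{host}:{port}/{db}"


# The only line that changes when moving to Aurora is the host:
before = dsn("app", "s3cret", "db01.internal", "orders")
after = dsn("app", "s3cret",
            "mycluster.cluster-abc123.eu-west-1.rds.amazonaws.com", "orders")
```

The storage-layer distribution and the parallel queries happen entirely on the server side; the application never sees them.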
8 - Burst models
Amazon recognises that not all cloud machines are grinding through endless streams of data, all day and into the night. Many machines like web servers have periods of high demand followed by a bit of a lull.
Many of the EC2 machines like the T3 line are designed to be burstable, which means you have some access to extra CPU when you need it. If your average load stays below a baseline, occasional CPU spikes don’t cost any more. If your web content goes viral and the spikes turn into a sustained load, well, AWS applies a surcharge.
The model lets AWS lower the basic price, making it more cost effective to choose a bigger machine just in case. You don’t need to be constantly watching the load while pinching pennies.
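The accounting behind bursting is easy to sketch. In the model below, one credit is one minute of a full vCPU, the instance earns a fixed number of credits per hour, and running above the break-even utilisation spends the balance down; the specific numbers are illustrative, not the published figures for any particular T3 size:

```python
def credit_balance(hourly_loads, credits_per_hour=12.0, max_credits=288.0):
    """Hour-by-hour CPU-credit balance for a one-vCPU burstable instance.

    Each hour the instance earns `credits_per_hour` and spends 60 * load,
    so the break-even (baseline) utilisation is credits_per_hour / 60,
    i.e. 20 per cent with the defaults here.
    """
    balance = 0.0
    history = []
    for load in hourly_loads:
        balance += credits_per_hour - 60.0 * load
        balance = min(max(balance, 0.0), max_credits)  # balance is capped
        history.append(balance)
    return history
```

If the balance stays pinned at zero hour after hour, the spikes have become a sustained load, and a bigger machine (or the surcharge) is the honest answer.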
9 - EC2 Spot Instances
If you want a new instance and you want it now, AWS offers a list price. But if you want to spend as little as possible, and the job isn’t urgent, you can put in a bid in AWS’ spot market where the prices drift up and down based upon demand.
Yes, it’s a bit of a pain to make sure you’ve got a high enough bid, but if you’re willing to put in the time you can rest assured that you’re saving money by getting the market clearing price. The other cloud providers offer discounts for heavy usage, but AWS lets the market find the true bottom. Some late night during a holiday the price could drop lower than ever and your code can be there to reap the financial benefits.
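The trade-off is simple to model: your job only makes progress in the hours when the clearing price is at or below your bid, and you pay the clearing price rather than the bid. A toy simulation (the prices and bid here are made up):

```python
def simulate_spot(hourly_prices, bid, hours_needed):
    """Run a batch job on a spot market: it progresses only in hours where
    the market price is at or below the bid, and each such hour is billed
    at the market price (not the bid)."""
    cost = 0.0
    elapsed = 0
    for hour, price in enumerate(hourly_prices):
        if price <= bid:
            cost += price
            elapsed += 1
            if elapsed == hours_needed:
                return cost, hour + 1  # total cost, wall-clock hours
    return cost, None  # the job did not finish inside the window


# A cheap overnight lull lets a three-hour job finish well below list price.
prices = [0.10, 0.12, 0.04, 0.03, 0.05, 0.11]
cost, finished_after = simulate_spot(prices, bid=0.06, hours_needed=3)
```

The patience tax is the `None` branch: set the bid too low and the job simply waits.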
10 - Alexa for Business
Alexa is a fun toy around the house. It makes a great gift, especially for anyone who can’t figure out technology. So why not bring it to the office, especially for those suits in the corner office who need someone else to read their email? That’s Amazon’s plan and the company has been building out much of the infrastructure for office management like booking conference rooms.
Even tech-savvy folks can appreciate barking out commands instead of trying to figure out the right URL and login to get the right permissions to do anything. Alexa for Business will be your quiet office assistant, listening to everything you say in the office. Just don’t let it remind you of HAL in 2001: A Space Odyssey.
11 - Browser-based development
While AWS continues to support the kind of command-line development that is still favoured by many developers, it is also exploring the ways of turning your browser into the IDE for the AWS cloud. AWS purchased Cloud9, the browser-based code editor, and put it in a constellation of tools for managing the deployment of all of the code you write with it.
When you build your Lambda functions, you can do it right in the browser without downloading anything. There are also a number of so-called “no code” options like the GraphQL API builder that lets you build your API with a few clicks. Soon you’ll be able to do pretty much everything in the same browser you use to watch cat videos.
12 - Free game engine
Are you building a game? You could sign a contract with one of the well-known independent game engines out there, or you could choose Amazon’s Lumberyard, a game engine that is supposedly free.
In case this sounds too good to be true, Amazon answers your skepticism in the Lumberyard FAQ by explaining, “We make money when you use other AWS services to power your game.” In other words, you’ll only pay when your users show up and start driving up the load on the AWS machines.
It’s not clear whether this is a great business deal or not because the costs are all hidden in the fractions of the cent you’ll be paying for every millisecond of machine time, but you can always rationalise it by saying that you’re going to need server time anyway.
13 - Snowmobile
If the word “big” in “big data” to you means something approaching 100 petabytes, the Internet isn’t an option for moving your data. It may work fine for sending an email or even uploading a zip file with a few megabytes of data, but pushing the largest data sets over the backbone would take years.
Amazon built its Snowmobile because, when it comes to bandwidth, nothing beats a shipping container filled with disk drives. Amazon will drive the Snowmobile to your data centre so you can start importing data at the speed of your local network.
When it’s full, Amazon will drive it across the country to its data centre while tracking its movement with GPS just to be sure it doesn’t get lost. At the other end, it will import the information into either S3 or Glacier.
If you’ve only got to move one petabyte or so, Amazon also makes a smaller box called the Snowball that the company will ship to you just like that box full of books or clothes or whatever else you buy from Amazon.
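The arithmetic behind the truck is easy to check: even a dedicated 10 Gbps link running flat out for months can’t compete with a shipping container of drives.

```python
def transfer_days(petabytes, gbps):
    """Days needed to move `petabytes` of data over a link running at
    `gbps`, assuming the link stays saturated the whole time (generous)."""
    bits = petabytes * 1e15 * 8
    seconds = bits / (gbps * 1e9)
    return seconds / 86400


# 100 PB over a saturated 10 Gbps link: roughly two and a half years.
days = transfer_days(100, 10)
```

A truck full of disks, by contrast, crosses the country in days, which is why the old joke about the bandwidth of a station wagon never dies.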
14 - Archival processing
In the beginning, archival storage was designed to sit offline until it was needed. Amazon’s Glacier Select lets you write SQL-like queries against the data squirrelled away in the low-cost, archival storage bins. You don’t need to pay to fetch all of the bits. You don’t even need to fetch them. Glacier will find the right bits and leave the rest of the archive behind.
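The query itself is plain SQL over the archived objects. The expression below follows the `SELECT ... FROM S3Object s` form that the Select feature accepts (the column names are hypothetical), and the little function underneath mirrors locally what the service does server-side: scan the archive, keep only the matching columns of the matching rows, and leave everything else behind.

```python
import csv
import io

# The kind of expression the Select feature accepts (column names made up):
EXPRESSION = (
    "SELECT s.patient, s.treatment FROM S3Object s "
    "WHERE s.outcome = 'recovered'"
)


def select_rows(csv_text, predicate, columns):
    """Local stand-in for the server-side scan: return only the requested
    columns of the rows that satisfy the predicate."""
    out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if predicate(row):
            out.append({c: row[c] for c in columns})
    return out


archive = "patient,treatment,outcome\nA,drugX,recovered\nB,drugY,worsened\n"
hits = select_rows(archive, lambda r: r["outcome"] == "recovered",
                   ["patient", "treatment"])
```

The point of doing this inside the archive is that only `hits` crosses the wire; the rest of the bucket never wakes up.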