How servers and cloud services are evolving into their next generation, covering the databases, servers and other essential services a business needs to operate efficiently while staying safe from security breaches, downtime and technological obsolescence.
The phrase "cloud computing" has become something of a silver bullet. Businesses reach for it as a solution to all their problems as frequently and confidently as a politician on the spot invokes "education" or a doctor prescribes "rest". For more than a decade now, the answer to many an IT puzzle has been "the cloud", but, rightly or wrongly, the cloud simply facilitates existing business processes in a secure, remote location, and it offers far more function than simple upload and access.
The term "cloud computing" carries a lot of intricacy and misunderstanding, as its meaning has shifted repeatedly since its inception.
Even before the term properly caught on, mainframe companies with large capacity sold timeshare access, offering spare storage, computing power and services to other users and their corporate clients. Companies like America Online were among the first to offer such services, even under the guise of entertainment. Others banded together to form data centres, hired out servers through Unix accounts and allowed clients to upload and save data to these storage centres.
The phrase, and what it captured under its umbrella, continued to evolve throughout its life. The earliest forms of cloud were essentially lots of small virtual machines running on large servers, with users given root privileges on just a small slice of the machine. FTP servers then changed their monikers to dropboxes and buckets.
Since then, demand for these services has skyrocketed, and remotely accessible storage became a huge selling point: anyone with the server's IP address could upload and download to it, provided they had the login credentials.
As the changes continue apace and requirements creep upward, users want to make the most of each innovation in the cloud, so their expectations grow as they try to get the most for their money. Many of these innovations, however, are old ideas in new packaging, recycled and refreshed. Others are smart solutions to problems inherent in the previous generations.
All of it is an opportunity to take stock of what already exists and see what new creations can be made. If any industry stands still, it risks loss of clients and users, and the same is true for cloud services.
Below are seven major ways that the cloud is evolving:
Open source functions
A common way of deploying to the cloud is for developers to write a simple function and let the cloud platform complete the rest of the work. Normally, this interaction with the framework becomes proprietary, because it is the cloud provider's tools that ultimately allow the small bit of code to make decisions and process data. This technology is extremely useful for companies that need to blend a variety of services into one integrated product.
To counteract this vendor lock-in, many open source projects have arisen: Knative, OpenFaaS, Fission, Kubeless and OpenWhisk, among others. These allow the functions to run on any local machine as the user requires, and don't lock users into proprietary source code or a particular programming language.
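A minimal sketch of what such a function looks like, in the style of the OpenFaaS Python template (a `handle` function that receives the request body and returns a response). The `handle` name follows that template's convention; the payload fields are invented for illustration:

```python
# A function-as-a-service handler: the platform delivers the request
# body as a string and forwards whatever the function returns.
import json

def handle(req: str) -> str:
    """Parse a JSON request body and return a JSON response."""
    payload = json.loads(req) if req else {}
    name = payload.get("name", "world")
    return json.dumps({"message": f"hello, {name}"})

if __name__ == "__main__":
    # Locally, the same function is just ordinary Python.
    print(handle('{"name": "cloud"}'))
```

Because the handler is plain code with no platform-specific imports, the same file can be tested on a laptop and deployed to any FaaS framework that uses this calling convention.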
Evolving databases

When they first came about, databases just stored basic tables of data. Now they can be as sophisticated and as fast-evolving as a programming language. As new features get added they are still called databases, but they are capable of doing almost everything.
Developers are noticing the true potential that lies beneath. Databases like Azure Cosmos DB combine functionality from MongoDB, SQL, Cassandra and various APIs. Google's Firebase combines distribution with storage, delivering data to clients through replication.
PostgreSQL 11 can compile queries with its own JIT, and transactions can be rolled back or committed from within embedded procedures. It further creates its own ecosystem by speaking JSON, so one could easily build a full microservice without having to use anything outside the database.
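A sketch of what "the database speaking JSON" means in practice, using Python's built-in sqlite3 module as a stand-in for PostgreSQL (most CPython builds ship SQLite with its JSON functions enabled). The table and field names here are invented for the example:

```python
# Store JSON documents in a table and query inside them with SQL,
# without parsing the JSON in application code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (body TEXT)")
conn.execute(
    "INSERT INTO events VALUES (?)",
    ('{"user": "alice", "action": "login"}',),
)

# Extract a field from inside the JSON document directly in the query.
row = conn.execute(
    "SELECT json_extract(body, '$.user') FROM events"
).fetchone()
print(row[0])  # -> alice
```

PostgreSQL's own `->`/`->>` operators and `jsonb` type serve the same role, which is why a service can keep its documents, queries and logic all inside the database.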
Cost-reducing granularity
The addition of more serverless platforms to the competitive landscape has incentivized clean, money-saving coding. In the early days, the minimum billable unit was fairly coarse (AWS Lambda, for example, rounded every call up to at least 100 ms).
This meant that users and their IT departments were paying for blocks of time they didn't use or need. Now AWS, for example, can bill in 1 ms increments, so costs reflect the calls actually made to the system far more accurately. This rewards and encourages code that is clean and makes fewer detours, which in turn saves money and makes costs more transparent to clients.
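A back-of-the-envelope comparison shows why granularity matters. The per-millisecond price below is a made-up illustrative figure, not AWS's actual rate:

```python
# Compare billing the same five calls under 100 ms rounding vs 1 ms billing.
import math

PRICE_PER_MS = 0.0000000021  # hypothetical $ per ms of compute

def cost(duration_ms: float, granularity_ms: int) -> float:
    """Bill a call rounded up to the nearest billing increment."""
    billed = math.ceil(duration_ms / granularity_ms) * granularity_ms
    return billed * PRICE_PER_MS

calls = [12, 7, 45, 3, 110]  # real durations of five calls, in ms
old = sum(cost(d, 100) for d in calls)  # 100 ms minimum blocks: 600 ms billed
new = sum(cost(d, 1) for d in calls)    # 1 ms increments: 177 ms billed
print(f"100ms blocks: ${old:.10f}, 1ms billing: ${new:.10f}")
```

For these short, bursty calls the coarse rounding bills more than three times the compute actually used, which is exactly the overhead finer granularity removes.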
Computing at the edge

A major benefit of cloud computing is that it can bring computation and data storage closer to where they are needed, i.e. to the edge of the network, improving response times and saving bandwidth. By spreading services and data centres geographically, providers can route requests to, or allocate resources at, the location closest to the client. Companies like Cloudflare, whose original purpose was essentially lightweight content caching, now offer smart computational services.
They will run your code, in any of four supported languages, as close to you as possible from one of their 200+ data centres.
The point is to get the code away from the major data centres and congested locations so that response times can be quicker. Amazon is moving in this direction, for example with SageMaker, and is doubling down on its close links with 5G networks, because it views mobile transmission and web access as the growth area of the next decade and expects mobile networks to become the major channel through which internet data is transmitted.
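The routing decision at the heart of edge computing can be sketched very simply: probe the latency to each candidate data centre and pick the lowest. The region names and latencies below are invented for the example:

```python
# A toy edge router: choose the data centre with the lowest measured
# round-trip latency to the client.
def nearest_region(latencies_ms: dict[str, float]) -> str:
    """Return the region with the smallest round-trip latency."""
    return min(latencies_ms, key=latencies_ms.get)

# Hypothetical probe results from one client, in milliseconds.
probes = {"eu-west": 18.0, "us-east": 95.0, "ap-south": 140.0}
print(nearest_region(probes))  # -> eu-west
```

Real providers do this with anycast routing and DNS rather than explicit probes, but the goal is the same: serve each request from the closest point of presence.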
New uses for Spreadsheets
Spreadsheets, for so long the refuge of accountants and strategy managers, are now being used to create apps. Google's AppSheet has reportedly been used by one company to build and deploy 35 apps without any coding skills. Spreadsheets are becoming a way to give the masses access to the cloud.
Their functionality as a smart file format is gaining traction, so much so that a "no-code" movement has sprung up, allowing people with only macro skills to bypass the layer of programmers and create apps themselves, directly from spreadsheets. After all, the user is the one performing the process; having to explain the task and the desired requirements to a developer is just another layer and another opportunity for miscommunication.
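The core idea, that rows of a sheet drive application behaviour instead of bespoke code, can be illustrated with a toy example. The columns, items and discount rules below are invented:

```python
# "Spreadsheet as app": business users edit the rows; the application
# logic stays generic and just reads whatever the sheet says.
import csv
import io

SHEET = """item,price,discount_pct
widget,10.00,10
gadget,25.00,0
"""

def quote(item: str) -> float:
    """Look up an item's discounted price from the spreadsheet rows."""
    for row in csv.DictReader(io.StringIO(SHEET)):
        if row["item"] == item:
            return float(row["price"]) * (1 - float(row["discount_pct"]) / 100)
    raise KeyError(item)

print(quote("widget"))  # -> 9.0
```

Changing a price or discount means editing the sheet, not redeploying code, which is the productivity gain the no-code tools package up behind friendlier interfaces.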
Going past spreadsheets to other office suite programs, companies can integrate as many of them as they like into their own custom applications. Word processors, slide show builders, project management tools, among others, can now be part of an internal connected system rather than discrete entities, which makes life easier for people who have to switch between various apps as part of their daily routine.
Cutting down OS bloat
A unikernel avoids booting up anything superfluous to the task at hand. A conventional OS does exactly the opposite, keeping everything ready for any task. Unikernels remove this complexity, allowing for a much leaner and more efficient system. Leaving out unnecessary libraries is not only efficient but also far more secure, as the surface exposed to attack is much smaller.
Two of the biggest software operators in the industry, Google and Amazon, offer this minimal-OS approach through Container-Optimized OS and Bottlerocket, respectively. They apply the required virtualization and provide a minimal Linux that runs directly on the hypervisor. These are great for microservices with one small function, which don't need a full, more bloated operating system.
Arm in the cloud

Arm (originally Advanced RISC Machines) designs reduced instruction set computing (RISC) architectures for processors used in a wide variety of IT environments.
Amazon runs its own Graviton chips with Arm cores on a line of its servers, which it claims are up to 40% cheaper to run. This does take a bit more developer work, though: executables need to be recompiled for the Arm platform, except for programs in certain higher-level languages like Java or PHP.
Depending on the workload, there may be cost savings in near-production environments; it depends heavily on the nature of the computation and the load. Some benchmarks put its performance in the same range as Intel's, while others find it slightly less capable, making it better suited to code that isn't used very heavily, where the shorter durations still yield savings.
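The reason higher-level languages sidestep recompilation is that the same source runs on any architecture the interpreter or runtime supports. A small sketch, reporting whichever CPU architecture the interpreter happens to be running on:

```python
# The same script runs unchanged on x86 and Arm hosts; only the
# interpreter binary differs per architecture.
import platform

arch = platform.machine()  # e.g. "x86_64" on Intel/AMD, "aarch64" on Arm
print(f"Running unchanged on: {arch}")
```

Compiled binaries, by contrast, are tied to one instruction set and must be rebuilt (or cross-compiled) to move between the two.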
Many more advances and applications of cloud technology keep appearing as adoption expands into every corner of the industry. Costs are coming down, and with large Chinese companies like Alibaba driving the competition and pushing prices lower, consumers of cloud services only stand to gain. From its humble origins as rentable data storage to its potential ubiquity throughout all our networks and companies, we can expect its uses to permeate every aspect of our lives.
The cloud will be welcomed with open arms by anyone who needs to store and process data to move their business forward. The advantages make it attractive for small and medium-sized companies to leverage big-company scale and technology to keep their businesses up to date and secure.