Few technologies are enjoying the kind of growth and popularity that cloud computing sees today. This is nothing new; "cloud" has been a buzzword for years, and its fan base keeps growing with no signs of slowing down. As capable as the cloud is now, it wasn't always this way, and it went through plenty of growing pains along the way. To understand how cloud computing got to where it is today, it helps to take a trip back and see how it all started, and just how much it has changed since then.
Believe it or not, cloud computing is not a new idea; the underlying concept dates back to the 1950s. Back then, large-scale mainframes were sold to corporations and universities. Their hardware took up so much space that each machine was installed in its own dedicated "server room." Multiple users then accessed the mainframe through "dumb terminals," stations whose sole purpose was to provide access to the mainframe.
These mainframes were hugely expensive to buy and maintain, so organizations could not afford one for every user. Instead, it became standard practice for multiple users to share the same storage and CPU power from any available station. Sharing resources this way gave organizations a far better return on their mainframe investment.
The Server Evolution
The way servers evolved had a direct impact on how the cloud evolved. Buying a server in the 1990s was nowhere near as expensive as acquiring a mainframe in the 1950s, but it was still a significant outlay. As more and more people came online, the practical answer was virtualization: physical servers were partitioned into multiple virtual hosting environments. This allowed companies to cut infrastructure costs by minimizing the amount of hardware needed to meet their needs.
It was around this time that the term "cloud computing" came into use, alongside "utility computing." Pooled together, the resources seemed to meld into one nebulous blob of computing capacity that could then be carved up on demand. In these cloud environments it was extremely easy to add capacity: simply slot another server into the rack and it immediately became part of the larger system.
These days, using the cloud can be as simple as creating an account, installing an application, and dragging files into a synced folder. Cloud services have also become affordable, so people no longer have to pay an arm and a leg to use the technology.