Data Storage: Past, Present and Future | Secure Cloud Backup Software | Nordic Backup

While we don’t want to sound like an old man on a porch talking about “well, back in my day the cloud only meant one thing”…

 

It is important to understand how far we have come, from the days of early data storage right up to this very moment. Like most things in life, it helps to understand your past, present and future.

 

1960s

Before the decade of political assassinations, the Vietnam War and Woodstock, the U.S. military had developed a massive machine called ENIAC, which stood for Electronic Numerical Integrator and Computer. It weighed 30 tons and took up 1,800 sq ft of floor space. Multiple technicians were needed to keep it running, and it performed about 5,000 operations per second.

Starting in 1960, computers converted from vacuum tubes to solid-state devices such as the transistor. This made them longer-lasting, smaller, more efficient, more reliable, and less expensive. Even so, by the early 1960s most computers still cost about $5 million each! Although, for a bargain, you could rent one for only $17,000 per month!

Later in the 1960s, American Airlines and IBM teamed up to develop a reservation program called the Sabre® system. It ran on two IBM 7090 computers located in a specially designed computer center in upstate New York.

 

1970s

The decade that belonged to Watergate and disco saw some data improvements of its own. By 1973, the facilities we now know as “data centers” were appearing all around the United States, and many were documenting formal disaster recovery plans so that a data disaster would not necessarily disrupt business operations. These functions were batch operations and not very complex in nature. In 1977, ARCnet, the world’s first commercially available local area network, was first put into service as a beta site at the Chase Manhattan Bank in New York City. Mainframes required special cooling, so in the late 1970s air-cooled computers began moving into ordinary offices.

 

1980s

During the MTV, “greed is good” decade of the 1980s, the computer industry experienced a renaissance thanks to the IBM Personal Computer (PC). Computers were installed all over the place, often with little thought given to the operating requirements or environment of the machines. In 1985, IBM provided over $30 million in products and support over the course of five years to a supercomputer facility established at Cornell University in Ithaca, New York. In 1988, IBM introduced the IBM Application System/400 (AS/400), which became one of the world’s most popular business computing systems.

 

1990s

In this decade, during the infancy of the World Wide Web, servers started to find their place in the old computer rooms that became known as “data centers”. With inexpensive networking equipment now widely available, companies began putting server rooms inside their own walls. With this came the infamous dot-com bubble: companies needed fast internet connectivity and nonstop operation to deploy their systems, and some started building large facilities that could offer a multitude of businesses a wide range of options for systems deployment and operations. In 1999, Rackspace Hosting opened its first data center to businesses.

 

2000s

From Y2K onward, the number of government data centers has grown at an amazing rate, with millions upon millions of new servers deployed every year. As online data grows at such rapid rates, there is a clear need to run more efficient data centers.

 


THE FUTURE

The past is fascinating because it shows how far we have come, and the present is important because it is what is happening as I type this. But the real fun comes from the future. What will data centers look like in the 2020s and beyond? One current trend that may carry over is the convergence of servers and storage into a single box. The next big change in data centers will likely be geared more towards a software shift. Beyond the 2020s, we will most certainly move away from “ancient” data techniques (the stuff way back in 2010).
