The Three Trends Changing Data Centres
29 March, 2016
The storage industry is undergoing a period of rapid development, with three trends emerging to change the way data storage is considered. We could go as far as to say that storage is fast becoming one of the most revolutionary fields within the technology sector. Big companies like Amazon, Facebook and Google are partly responsible for the changing face of the data centre, but there are plenty of smaller brands also making waves. These larger names have devoted huge resources to building and maintaining customised data centre infrastructures. This ground-up investment has created tremendous levels of flexibility, scalability and efficiency. With requirements for data storage growing globally, the research and innovations pioneered by the big three will have implications for the development of data centres as a whole. We’ve rounded up the key trends to look out for as a result of their work…
Cost is a major factor in data storage, made worse by the fact that poor design often leads to an inefficient solution (many resources sit unused or underutilised, for example), which has cost implications. The traditional architecture physically attaches shelves to controllers, meaning there is a finite amount of performance and capacity. Unless a customer exhausts both performance and capacity at the same time, one of the two will be underutilised. This is wasteful not only in up-front costs but also in the day-to-day cost of keeping the storage facility running below its capacity. New architectures are being developed to address these problems.
An alternative way of thinking started at Facebook with a disaggregated approach, which physically separates the infrastructure into its component functions before connecting them via Ethernet. This means companies can gradually add to the system as their requirements change, making it possible to seamlessly increase performance without changing capacity, and vice versa. This approach to utilisation can also drastically reduce the amount of physical hardware required to meet a given performance target.
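To see why this matters for utilisation, consider a minimal sketch of the two designs. All class names and figures below are invented for illustration (50,000 IOPS and 100 TB per unit are arbitrary assumptions, not real product numbers); the point is simply that a coupled design forces you to buy capacity and performance together, while a disaggregated one lets them scale independently.

```python
# Illustrative sketch only: hypothetical models of a coupled (controller +
# shelf) array versus a disaggregated cluster. Figures are arbitrary.

class CoupledArray:
    """Traditional design: every shelf adds both performance AND capacity."""
    IOPS_PER_SHELF = 50_000
    TB_PER_SHELF = 100

    def __init__(self):
        self.shelves = 0

    def add_shelf(self):
        self.shelves += 1  # performance and capacity arrive together

    @property
    def iops(self):
        return self.shelves * self.IOPS_PER_SHELF

    @property
    def capacity_tb(self):
        return self.shelves * self.TB_PER_SHELF


class DisaggregatedCluster:
    """Disaggregated design: compute and storage nodes scale independently,
    connected over Ethernet."""
    IOPS_PER_COMPUTE_NODE = 50_000
    TB_PER_STORAGE_NODE = 100

    def __init__(self):
        self.compute_nodes = 0
        self.storage_nodes = 0

    def add_compute(self, n=1):
        self.compute_nodes += n  # adds performance only

    def add_storage(self, n=1):
        self.storage_nodes += n  # adds capacity only

    @property
    def iops(self):
        return self.compute_nodes * self.IOPS_PER_COMPUTE_NODE

    @property
    def capacity_tb(self):
        return self.storage_nodes * self.TB_PER_STORAGE_NODE


# A performance-hungry workload needing 200k IOPS but only ~100 TB:
coupled = CoupledArray()
while coupled.iops < 200_000:
    coupled.add_shelf()          # forced to take 100 TB with every 50k IOPS

disagg = DisaggregatedCluster()
disagg.add_compute(4)            # 200k IOPS
disagg.add_storage(1)            # only the 100 TB actually needed

print(coupled.capacity_tb)       # 400 TB provisioned, mostly idle
print(disagg.capacity_tb)        # 100 TB provisioned
```

The coupled array ends up provisioning four times the capacity the workload needs, which is exactly the underutilisation (and day-to-day running cost) the disaggregated approach avoids.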
Integrated storage systems are extremely expensive. Previously, however, they were the only way enterprises could ensure reliability, availability and supportability from their data storage. Now a new generation of software-focused vendors is delivering both the reliability and the accessibility needed for enterprise-level data storage. The shift away from integrated systems is well and truly under way.
Having the proper storage architecture in place can be deeply beneficial. Integrated systems have capacity and performance limits rooted in the hardware, which means a higher cost of replacement. A software-first approach works with plug-and-play hardware that is much easier to replace. Storing data in this way offers a more functional and cost-effective system.
The definition of scale has radically changed: it has gone from how large a file system a storage solution can support to how many file systems or connections it can support. Traditional storage solutions struggle to adapt to this changing definition, while modern systems can support hundreds of thousands, if not millions, of workloads. Better systems also prioritise those workloads in order to support different service requirements.
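The prioritisation idea can be sketched with a simple priority queue. This is not any vendor's actual scheduler, just a minimal illustration (workload names and priority numbers are invented) of serving latency-sensitive workloads ahead of batch ones.

```python
# Illustrative sketch only: a minimal priority scheduler for storage
# workloads, using Python's standard-library heap.
import heapq

def schedule(workloads):
    """Return workload names in the order they would be serviced.

    `workloads` is a list of (priority, name) pairs; lower numbers are
    more urgent (e.g. a transactional database before a batch backup).
    """
    heap = list(workloads)
    heapq.heapify(heap)
    order = []
    while heap:
        _, name = heapq.heappop(heap)
        order.append(name)
    return order

# Hypothetical mix of workloads with different service requirements.
print(schedule([(2, "analytics-batch"), (0, "oltp-db"), (1, "vdi-boot")]))
# → ['oltp-db', 'vdi-boot', 'analytics-batch']
```

A real system supporting millions of workloads would of course schedule continuously and weigh far more than a single number, but the principle is the same: connections are ranked, not treated uniformly.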