Disrupt or be disrupted
HCI’s ability to deliver key storage functions such as snapshots/cloning, replication and flash acceleration without the need for specialist SAN expertise has proved a real game changer. Initial deployments have largely focused on the midmarket, where IT organisations often lack storage expertise and are not tied to a specific storage supplier.
But as HCI ventures deeper into enterprise and cloud environments, its architectures will need to become more efficient, agile and adaptable to help IT professionals shoulder the burden of rapidly growing datasets and workloads.
Deduplication and in-line compression allow companies to store more data in the same storage footprint by eliminating redundancies as data is written to disk or flash. Deduplication works well for reducing VM images and files and has moved into the primary storage space. Compression is necessary for reducing database and other application workloads, where deduplication is far less effective. 451 Research believes both have a major role to play in performance efficiency because they allow HCI nodes to cache more data within expensive flash SSDs and PCIe cards.
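To illustrate the mechanism, here is a minimal sketch of in-line block-level deduplication combined with compression. The fixed 4 KB block size, the SHA-256 fingerprinting and the dictionary block store are illustrative assumptions, not the design of any particular HCI product; real systems use variable block sizes, hardware offload and far more sophisticated metadata.

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # hypothetical fixed block size; real systems vary


def dedup_and_compress(data: bytes, store: dict) -> list:
    """Split data into fixed-size blocks, deduplicate by content hash,
    and compress each unique block before storing it.
    Returns the list of fingerprints needed to reconstruct the data."""
    fingerprints = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:           # only unique blocks are stored
            store[digest] = zlib.compress(block)
        fingerprints.append(digest)
    return fingerprints


def restore(fingerprints: list, store: dict) -> bytes:
    """Rebuild the original data from its fingerprints and the block store."""
    return b"".join(zlib.decompress(store[fp]) for fp in fingerprints)
```

Highly repetitive data such as cloned VM images collapses to a handful of unique blocks, which is why deduplication shines there, while the compression step still shrinks the unique blocks that remain.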
Scale-out is a common capability in existing HCI products, but most have a rigid architecture that forces organisations to add compute, memory and storage in fixed blocks. This can create inefficient silos of unused processing and storage resources.