By Martin Vorster, Account Manager at Aptronics
We’re living in an age of ever-expanding data. As the IT landscape continues to evolve, businesses are becoming increasingly data-driven, and as organisations grow and adapt to change, they will typically need to store more information.
Data storage has come a long way over the past few years. Cloud computing has had a significant influence, with more and more businesses steadily adopting cloud-based storage solutions. According to a study by LogicMonitor, this trend is largely driven by digital transformation, the need for greater agility, DevOps, artificial intelligence (AI) and machine learning (ML), along with the Internet of Things (IoT). A 2020 survey by Spiceworks shows that the adoption of cloud storage is on the rise, with 39% of businesses using cloud-based storage infrastructure (on AWS, Azure, Google Cloud, etc.) and an additional 20% planning to adopt it by 2022. Many companies are taking a hybrid approach, in which data and applications move between a mix of on-premises, private-cloud and public-cloud environments.
Spiceworks reports that all-flash storage arrays are growing in popularity on-premises due to their reliability, high performance and affordability. According to the same report, many organisations have moved to hyperconverged infrastructure (HCI) solutions, which combine servers, storage, networking and virtualisation resources in a single scalable stack with a unified management interface and a single point of support. HCI delivers software-defined flexibility that is easy to manage and scale, boosts productivity, saves time and costs less than traditional storage.
As HCI has become more widespread, IT teams have begun to deploy it for a wider range of workloads, including mission-critical ones. HCI configurations are expanded by adding tightly coupled nodes that combine compute and storage resources, which delivers high levels of availability. However, not all workloads are created equal. Some are particularly storage-intensive and need more data capacity than compute; conversely, others require more compute than storage. IT can add nodes to meet a high storage demand, for example, but if the workload doesn't require more compute, a resource has been purchased that won't be utilised effectively, resulting in over-provisioning and unnecessary cost.
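The over-provisioning trade-off can be sketched with simple arithmetic. The node sizes and prices below are entirely hypothetical, chosen only to illustrate why scaling coupled compute-plus-storage nodes to meet a storage-only demand wastes money:

```python
import math

# Hypothetical, illustrative figures -- not real HCI pricing.
NODE_STORAGE_TB = 20            # usable storage per coupled HCI node
NODE_COMPUTE_CORES = 32         # compute cores bundled into each node
NODE_COST = 25_000              # cost per coupled node
STORAGE_ONLY_COST_PER_TB = 500  # cost per TB when storage scales independently

def coupled_cost(extra_storage_tb):
    """Cost of meeting extra storage demand by adding whole HCI nodes."""
    nodes_needed = math.ceil(extra_storage_tb / NODE_STORAGE_TB)
    return nodes_needed * NODE_COST

def coupled_idle_cores(extra_storage_tb):
    """Compute cores bought along with the storage but left unused."""
    nodes_needed = math.ceil(extra_storage_tb / NODE_STORAGE_TB)
    return nodes_needed * NODE_COMPUTE_CORES

def disaggregated_cost(extra_storage_tb):
    """Cost of adding only storage capacity, with no bundled compute."""
    return extra_storage_tb * STORAGE_ONLY_COST_PER_TB

demand_tb = 50  # the workload needs 50 TB more storage and no more compute
print(coupled_cost(demand_tb))        # 3 whole nodes: 75000
print(coupled_idle_cores(demand_tb))  # 96 cores sit idle
print(disaggregated_cost(demand_tb))  # 25000, no stranded compute
```

With these assumed numbers, meeting a 50 TB storage-only demand via coupled nodes costs three times as much and strands 96 compute cores; the exact figures will vary, but the shape of the trade-off is the point.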
Resiliency can also be compromised: if a node is lost during a planned upgrade or an unplanned outage, the business may lose access to its applications and data, which impacts performance.
Taking HCI to the next level
To tackle these challenges, disaggregated hyperconverged infrastructure, also known as hyperconverged infrastructure 2.0 or HCI 2.0, was developed. HCI 2.0 expands on the simplicity of HCI but introduces the concept of disaggregated hardware. Instead of integrating compute, networking, and storage resources at the node level, these have been separated and can be scaled independently of each other to ensure consistent performance, resiliency, and high availability. The efficient use of available resources reduces over-provisioning, and the mix of storage and compute can be easily adjusted according to demand.
HPE Nimble Storage dHCI
Hewlett Packard Enterprise’s (HPE) Nimble Storage dHCI (which stands for disaggregated hyperconverged infrastructure) is an intelligent platform that disaggregates compute and storage and integrates hyperconverged control. HPE ProLiant servers and Nimble Storage arrays have been incorporated into a single system to provide simple management on a flexible architecture that allows compute and storage resources to be scaled independently and non-disruptively.
The architecture is designed for business-critical applications and mixed workloads that demand the performance, resiliency and high availability of a SAN environment. HPE backs this up with a 99.9999% (‘Get 6-Nines’) data availability guarantee along with sub-millisecond latency at consistently high performance.
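To put the six-nines figure in context, the downtime budget an availability guarantee allows follows directly from the percentage. A quick sketch of the arithmetic (using a 365-day year for simplicity):

```python
def annual_downtime_seconds(availability):
    """Seconds of permitted downtime per year at a given availability level."""
    seconds_per_year = 365 * 24 * 3600  # 31,536,000 seconds in a 365-day year
    return seconds_per_year * (1 - availability)

print(annual_downtime_seconds(0.999999))  # six nines: roughly 31.5 seconds/year
print(annual_downtime_seconds(0.999))     # three nines: roughly 8.76 hours/year
```

In other words, a 99.9999% guarantee permits only about half a minute of downtime per year, versus nearly nine hours at 99.9%.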
HPE Nimble Storage dHCI is as simple as HCI to install, manage, scale and support. According to HPE, it takes just 15 minutes to go from rack to apps using server and storage automation, and it is quick to manage and scale, helping businesses accelerate their time to market.
All resources are software-defined and are simple to manage centrally through VMware vCenter. Virtual Volumes enables VM-centric data services (such as replication, backup, snapshots, etc.) and resource management. The system uses policy-based automation, which is easy to set up and eliminates manual processes so that applications can be deployed quickly. All resources are auto-discovered, which makes it easy to build out the platform.
Support made simple
HPE InfoSight is at the heart of HPE Nimble Storage dHCI. HPE InfoSight is the predictive analytics and artificial intelligence (AI) platform that provides global intelligence and makes support simple. It collects and analyses telemetry from millions of sensors across every HPE Nimble Storage dHCI stack and VM layer in HPE’s installed base to detect, prevent and auto-resolve issues, improve performance and optimise resources. HPE InfoSight learns to recognise and predict problems and stop the same issue from recurring, significantly reducing support escalations.
HPE Nimble Storage dHCI is also available as a service through HPE GreenLake, a flexible pay-as-you-go consumption model that provides a cloud experience, helping businesses speed their time to value and simplify IT management. Companies pay monthly for what they use, and their infrastructure is managed for them, freeing up their IT teams to be more productive.
Compute and storage can be scaled on demand and extended across a hybrid cloud, which provides agility and eliminates the risk of over- and under-provisioning capacity. There are no upfront expenses: capital expenditure is converted to operating expense, which lowers the total cost of ownership.
Getting data storage right
Data is one of a business’s most valuable assets. It needs to be carefully collected, managed, stored, and protected. Data storage also impacts an organisation’s performance, reliability, and scalability and is critical to its success. However, every business is different, and there is no one-size-fits-all solution—there are many factors to consider before deciding on a data strategy and storage system.
Aptronics and HPE have the knowledge, experience and technical expertise to assist and guide organisations in choosing a storage solution that best meets their business’s current and future needs. Contact us to learn more.