SDN Journal

SDN, SDS and Agility | @CloudExpo #BigData #SDN #DataCenter #Storage

Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project

IT planning is an imprecise science. Its aim is to increase the flexibility and agility of IT environments while keeping costs under control. In an ideal world, where time and budget are not limiting factors, upgrading an organization's infrastructure happens on an ongoing, as-needed basis.

In the real world, IT administrators have to make decisions about the hardware they put in place and how to maintain acceptable service levels over the course of the equipment's expected life. Most businesses do not have the luxury of replacing their storage systems outright when IT demands outpace the existing infrastructure.

Even with the best IT planning, applications continue to advance and place additional demands on storage network resources, leaving organizations to face a dilemma: operate at suboptimal levels or replace their infrastructure at great cost. An advanced SDS solution is needed to handle both the current and future demands of data storage while preserving the security and performance of applications.

Plan with Ease
Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project. Rather than replacing the existing infrastructure, an SDS layer would build onto it, combining any additional devices with the SAN and presenting them as one combined storage solution. When IT needs to add new storage media, such as flash for performance, inexpensive hard drives for capacity, or overflow to cloud storage, it seamlessly integrates with existing storage. All systems and applications, both new and old, remain fully available and centrally manageable.

The SDS solution would automatically maintain application Quality of Service levels by monitoring performance, optimizing capacity, and managing data placement and protection. By constantly reviewing storage activity, the software is able to adapt in real-time as demands and workloads change.

This all means IT no longer has to worry about finding the budget to replace current storage systems wholesale; instead, teams can plan and add what they need, when they need it.

If performance, capacity, or protection falls out of alignment with the desired policies, the software automatically resolves the deviation. In the case of performance, data is simply and transparently migrated to faster devices or closer to the workload to reduce network latency. If the volume's capacity runs low, it can overflow to the cloud so that admins only need to buy and maintain minimal physical storage. In the case of protection, the software can increase the number of fault domains or replicas for easy backup.
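The three kinds of drift described above (performance, capacity, protection) suggest a simple control loop: compare observed metrics against the volume's policy and emit a corrective action for each deviation. The following Python sketch is purely illustrative; the metric names, thresholds, and action labels are assumptions, not any vendor's actual API.

```python
def remediate(metrics, policy):
    """One pass of a hypothetical QoS control loop.

    `metrics` is what monitoring observed for a volume; `policy` holds the
    volume's targets. Returns the list of corrective actions to take.
    """
    actions = []

    # Performance drift: migrate hot data to faster media / closer to the workload.
    if metrics["iops"] < policy["min_iops"]:
        actions.append("migrate-to-faster-tier")

    # Capacity drift: overflow cold data to cloud so local storage stays minimal.
    # (10% free space is an arbitrary threshold chosen for the example.)
    if metrics["free_gb"] < 0.10 * policy["capacity_gb"]:
        actions.append("overflow-to-cloud")

    # Protection drift: restore the required replica count across fault domains.
    if metrics["replicas"] < policy["replicas"]:
        actions.append("add-replica")

    return actions
```

In a real system this loop would run continuously, and each action would kick off an asynchronous migration or replication job rather than returning a label.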

With little to no downtime, delays, provisioning, or manual data migration, the SDS solution integrates seamlessly into the underlying infrastructure by placing storage services in front of an existing storage device or system. By building on the features and functions of the existing SAN or NAS device, the software delivers the improved performance the environment needs, making IT planning easier than ever before.

IT Management Agility
The SDS allows you to specify how big, how fast, and how secure a volume (or workload) should be. This translates to capacity, performance, and protection policies in the software dashboard.

Each volume has an associated Quality of Service (QoS) policy, which describes how it should be managed for storage allocation, data migration, and performance scaling and throttling. The policy is implemented from the point of view of the application (i.e., the access point to the storage service).

Each volume has an associated pool of eligible storage resources for data in that volume. Automation uses policy information to change the members of the pool as necessary to provide the resources that will help maintain the QoS policy for the volume, or alert if it cannot do so. Most details of a volume can be changed dynamically.
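To make the policy-and-pool idea concrete, here is a minimal Python sketch of a per-volume QoS policy ("how big, how fast, how secure") and a function that selects an eligible pool of devices to satisfy it. All field names and the selection heuristic are hypothetical, invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class QoSPolicy:
    # Illustrative fields: capacity, performance, and protection targets.
    min_iops: int      # how fast the volume should be
    capacity_gb: int   # how big it should be
    replicas: int      # how secure: copies spread across fault domains

@dataclass
class Device:
    name: str
    media: str         # e.g. "flash", "hdd", "cloud"
    iops: int          # sustained IOPS the device can deliver
    free_gb: int
    fault_domain: str  # e.g. rack or site identifier

def eligible_pool(devices, policy):
    """Pick devices that can each host one replica of the volume.

    A device qualifies if it is fast enough and has enough free space;
    at most one device per fault domain, so replicas fail independently.
    Returns fewer devices than `policy.replicas` if the policy cannot be
    met, which is the caller's cue to raise an alert.
    """
    fast_enough = [d for d in devices
                   if d.iops >= policy.min_iops and d.free_gb >= policy.capacity_gb]
    pool, seen_domains = [], set()
    for d in sorted(fast_enough, key=lambda d: -d.iops):
        if d.fault_domain not in seen_domains:
            pool.append(d)
            seen_domains.add(d.fault_domain)
        if len(pool) == policy.replicas:
            break
    return pool
```

Because most details of a volume can be changed dynamically, re-running this selection with an updated policy would yield a new pool, and the automation would migrate data to match.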

A powerful SDS solution allows users to manage diverse storage systems and resources with one dashboard, reducing the IT knowledge required to make effective use of multiple types of storage devices. Through intelligent automation, the software eliminates manual data migration efforts by identifying, profiling, and utilizing new storage resources across the enterprise. The software achieves near-zero downtime through automation, reduced complexity, and data protection features.

Without manual intervention, data moves among storage resources to maintain QoS levels, adapting in real-time as demands and workloads change. The software becomes aware of new resources and automatically moves appropriate application data to them, continuously monitoring requests, analyzing priority based on performance, latency or bandwidth, and physically moving data to the most appropriate media. To speed up access to business-critical information, the software automatically moves data from slower storage hardware within the existing storage solution to faster-access media volumes.
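The automatic movement of data from slower to faster media is essentially a tiering decision driven by access patterns. The sketch below uses a simple access-count threshold as a stand-in for real heat tracking; the function, threshold, and tier names are assumptions for illustration only.

```python
def plan_tiering(access_counts, placements, hot_threshold=100):
    """Plan promotions and demotions between storage tiers.

    `access_counts` maps volume name -> recent access count (a crude proxy
    for how business-critical the data currently is); `placements` maps
    volume name -> current tier. Returns (volume, from_tier, to_tier) moves.
    """
    moves = []
    for vol, count in access_counts.items():
        tier = placements.get(vol, "hdd")
        if count >= hot_threshold and tier != "flash":
            moves.append((vol, tier, "flash"))   # promote busy data to fast media
        elif count < hot_threshold and tier == "flash":
            moves.append((vol, "flash", "hdd"))  # demote cold data to cheap capacity
    return moves
```

A production system would weigh latency and bandwidth priorities per the QoS policy rather than a single counter, but the promote/demote shape of the decision is the same.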

The Next Stage of SDS
Data storage of the future will become something that companies can simply rely upon, not something costly, time-consuming, and dependent on specialized staff to maintain and manage. An SDS solution should provide advanced storage automation that unifies existing storage resources, centralizes storage management, simplifies the deployment of flash, and improves storage utilization, all while delivering application-specific quality of service levels. If your current SDS solution is not doing all this, then you need to find one that is. The future is waiting.

More Stories By Steven Lamb

Steven Lamb is the CEO & Co-Founder of ioFABRIC. He established himself as a data storage expert building server-side caching at Nevex Virtual Technologies, and with ioFABRIC he now has a game-changing product in the data storage arena. Steven is a successful serial entrepreneur on his fifth venture, bringing a broad range of strategic positioning, management skills, and leadership experience.

Steven’s first company, Border Network Technologies, became the second largest firewall vendor worldwide. Others included INEX, Nevex Software, and most recently Nevex Virtual Technologies, a cache acceleration company.