Data center optimization is a never-ending project for federal agencies, and chances are you've found yourself in the midst of some heated discussions regarding the best way to address the problems of ever-growing data center sprawl. You shouldn't blame yourself for the expanding storage and server footprint your organization requires to remain operational. This is simply a result of an increasing wealth of data that must be made both accessible and secure.
Nevertheless, there are a number of initiatives that federal decision-makers have launched in an effort to make their data centers run with greater efficiency, security and accessibility. As you bear the brunt of heavier information storage and protection demands, you might want to consider following the trends set by these forward-thinking leaders in your quest for a superior infrastructure.
Ideally, you can execute these plans with more precision than your peers and make an even stronger case for your organization's modernization efforts. In many cases, however, things aren't going as smoothly as planned.
Stalling out
There's no stopping the ever-evolving diversity and volume of information produced by government agencies, but an article from FedScoop pointed out that organizations are not stepping up to the challenge effectively. Just how messy is the data center situation at the federal level? The source explored a survey from MeriTalk and Brocade revealing some alarming facts regarding the state of affairs in government IT. Of the decision-makers surveyed, 94 percent said they had dealt with network downtime as a direct result of overly complex setups, negatively impacting their missions.
"The complexity of some of those federal data centers today doesn't really lend itself to consolidation," said director of systems engineering for Brocade Daemon Morrell, as quoted by the source. "One really drives the other. What we are seeing is that agencies have problems with their mobility programs and then vendors come in with these quick-hit, proprietary solutions that fix the small problem but don't mesh with anything else."
This muddled approach to IT is simply unsustainable. If your agency has failed to meet the expectations of projects such as the Federal Data Center Consolidation Initiative, it may be time to consider taking a new approach to data center optimization. The question remains - what is the best way to untangle the mess that is federal IT?
Less is more?
While reducing the number of the government's IT assets may not be the most intuitive move considering the growth of data and application demands, an article from Federal Times suggested that data center consolidation projects may provide a cure for the clutter and complexity that currently waste so many federal resources. The piece, written by Dave Gwyn, vice president of Nutanix Federal, urged decision-makers to adopt converged data center technology utilizing software-defined elements rather than relying on app-specific servers and network components.
By switching over to a consolidated setup, you'll be able to get the most out of your IT resources while maintaining optimal efficiency from budget and energy perspectives. This is a powerful step toward realizing the vision of the FDCCI and other innovative projects.