Surviving the big data tsunami with open source
The data government agencies deal with today is not what it once was, and its volume is exploding. Because of that, storing and managing that data has become more challenging than ever before.
Today, “data is as unique as you can imagine. You have data coming in different sizes and formats: pictures, videos, medical data, sensor data. Coupled with that, the volume is just increasing exponentially,” Shaun Bierweiler, vice president of public sector for Hortonworks, says in an interview with FedScoop TV.
In an environment where “data is the new king,” federal agencies can become overwhelmed and begin to “drown” in their data. But new technologies, like Hortonworks’ open source big data management platform, allow them to escape that tsunami of data and get the analytics and intelligence they need to focus on their missions.
“They’re able to really focus in on what they bring in value to the mission,” Bierweiler says. “It really enables the art of the possible.”
Adding open source capability to the mix only strengthens an agency’s ability to succeed in its mission.
“If you look at a legacy, proprietary system, you’re going to have a very narrow window of the capabilities you’re going to be able to support. You’re going to be at the mercy of that vendor’s requirements,” Bierweiler says.
He adds, “What open source gives you is the flexibility of choice, it gives you the ability to harness the evolution of the community and to bring it into a way that’s going to really maximize your value.”
Learn more about how Hortonworks’ open source solutions can help you manage your data.