Growth in artificial intelligence (AI) and machine learning (ML) technologies is driving new demands across the Federal enterprise for secure storage and effective data management. Data is no longer just accumulated and stored. Teams rely on robust, timely data analysis to support daily operations and innovation. Secure storage, access, and management of data are critical to agency missions.
MeriTalk recently connected with Mike Lamb, product manager of solution infrastructure at ViON, and Robert Renzoni, director of technical sales for the Americas at Quantum. Lamb and Renzoni discuss how agencies can manage and preserve their data, and easily extract insights from it – whether it lives on premises, at the edge, or in a public or private cloud.
MeriTalk: As Federal agencies increasingly use artificial intelligence and machine learning to enhance mission-critical capabilities and accelerate modernization, what challenges do teams face? How have their data needs changed?
Rob Renzoni: One of the challenges that teams are facing is legacy IT infrastructure, which lacks interoperability because applications are siloed. Beyond that, we’re seeing unstructured data assets growing tremendously. Federal teams are rapidly moving to the cloud to provide greater access to their data, but the management problem remains. Agencies are expending more time and resources to figure out how to store data in a way that keeps it accessible to analysts and decision-makers, and ultimately, for mission operations.
Mike Lamb: The management problem is spot on. When agencies move their data to the public cloud, they then have to pull it back down to extract valuable insights. The large egress charges they incur when pulling data down undercut both the effectiveness and the cost savings they expected from the public cloud. Agencies are left feeling a lack of control over where and how they access their data, and those charges are very difficult to budget for.
Renzoni: When you’re talking about AI and ML, you have to be close to the data set. Depending on how far away you are from that cloud data center, your data may be affected by latency or other issues that slow down your workflow and impact the customer experience. When that happens, agencies may shift from the public cloud to private cloud. To accomplish that, they’re leveraging technologies like object storage that will give them the same cloud-like feel – but closer to where the data originated.
MeriTalk: What are the key steps agencies can take to ensure their data is both accessible and reliable? How does object storage play a role?
Lamb: You want to make sure that your technology meets your requirements and your needs, and that it can grow with those requirements and needs. Solutions that provide for data agility are paramount. Object storage provides a flat storage space where data and metadata are attached to one object, and accessible through APIs, so teams can tailor their applications to make calls against their data. Object storage enables teams to analyze and manipulate data quickly. It’s highly available, resilient, and very secure – you can count on that.
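The flat namespace Lamb describes – each key mapping to one object that carries both data and metadata, queried through API calls rather than a file hierarchy – can be illustrated with a minimal sketch. All class and method names here are hypothetical, for illustration only; real object stores (S3-compatible or otherwise) expose this pattern through their own HTTP APIs.

```python
from dataclasses import dataclass, field

@dataclass
class StoredObject:
    """One object: the payload plus user-defined metadata, under a single flat key."""
    data: bytes
    metadata: dict = field(default_factory=dict)

class FlatObjectStore:
    """Toy object store: a flat namespace mapping keys to objects.
    There is no directory tree -- slashes in keys are just characters."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data, metadata=None):
        self._objects[key] = StoredObject(data, metadata or {})

    def get(self, key):
        return self._objects[key]

    def query(self, **meta_filter):
        """An application 'call against the data': return the keys whose
        metadata matches every given key/value pair."""
        return [k for k, obj in self._objects.items()
                if all(obj.metadata.get(m) == v for m, v in meta_filter.items())]

store = FlatObjectStore()
store.put("sensor/0001", b"...", metadata={"source": "edge", "tier": "hot"})
store.put("archive/0002", b"...", metadata={"source": "hq", "tier": "cold"})
print(store.query(tier="hot"))  # ['sensor/0001']
```

Because metadata travels with each object, applications can select and manipulate data by attributes rather than by walking a directory structure – the agility Lamb refers to.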
Renzoni: Agencies might move to public or private clouds in order to enhance accessibility to information resources that were previously locked in proprietary or inaccessible desktop applications. However, when you are designing any architecture for a workflow, you need to do your homework. There are a lot of nuances to each individual deployment. Dig into the underlying technology.
Reliability comes from how well the data is protected. Today’s private clouds are built upon object storage, which leverages erasure coding to achieve extreme data durability. You have a better chance of winning the lottery ten times than losing data in a properly coded object store environment. It’s one of the most reliable ways to store large amounts of unstructured data.
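The durability Renzoni attributes to erasure coding comes from splitting data into shards and adding parity shards, so lost shards can be rebuilt from the survivors. Production systems use Reed–Solomon-style codes that tolerate multiple simultaneous failures; the sketch below is the simplest possible case – one XOR parity shard over k data shards, tolerating any single loss – purely to show the rebuild mechanics.

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data_shards):
    """Append one XOR parity shard to k equal-length data shards
    (a minimal k+1 erasure code: any single shard can be rebuilt)."""
    parity = reduce(xor_bytes, data_shards)
    return list(data_shards) + [parity]

def reconstruct(shards, lost_index):
    """Rebuild the shard at lost_index by XOR-ing all surviving shards."""
    survivors = [s for i, s in enumerate(shards) if i != lost_index]
    return reduce(xor_bytes, survivors)

data = [b"abcd", b"efgh", b"ijkl"]   # k = 3 data shards
encoded = encode(data)               # 4 shards total, spread across devices
rebuilt = reconstruct(encoded, 1)    # simulate losing shard 1
assert rebuilt == b"efgh"
```

Real deployments spread shards across many disks and nodes and use more parity shards, which is what drives the probability of data loss to the near-zero levels Renzoni describes.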
MeriTalk: In what ways does object storage support effective and resilient operations at the edge?
Renzoni: One example of an edge application is the adoption of 5G. It will bring a tremendous increase in the data flowing into 911 call centers, which are pretty much everywhere. A private cloud and object storage – with an AI/ML workflow analyzing the data – can provide actionable insights to first responders faster than one person or team ever could.
Consider critical environments where enormous amounts of data flow. While one person or team may not be able to keep up, AI/ML can help process and sort data in real time to provide immediately actionable intelligence that otherwise may be missed.
MeriTalk: Security is a mission imperative, of course. How are recent policies, such as the Executive Order on Improving the Nation’s Cybersecurity, affecting how agencies approach data storage and management?
Renzoni: Agencies are tightening up security in every aspect. Protection against ransomware is an increasing priority. We’re seeing more focus on how data is encrypted and how we can lock down access protocols to make sure that not everybody has access to the data.
The vendor community has been quick to react with new features to address the rising number of cyber threats, but some legacy technologies are also resurfacing. For example, tape libraries are a truly air-gapped way to provide effective protection against ransomware and malware.
A U.S. university recently faced a carefully planned cyberattack where Trojan horse malware was introduced to their systems through fraudulent emails. The malware attacked their file system and spread between physical and virtual servers, laptops, and devices such as thumb drives. Once on the disk, the malware encrypted the files so they could no longer be read. However, the university had an air-gapped tape backup, which saved their data.
More and more agencies are adding tape libraries into their backup architectures to mitigate the damages of successful attacks.
Lamb: Cybersecurity is key, especially with so many remote workers. As agencies accelerate zero trust, they are looking for more role-based access controls within their data centers. Data storage should align with and support those role-based access controls. Air gapping data and isolating it from the rest of the network is a way to make sure it is “set aside” and can’t be hit if a ransomware attack does occur.
MeriTalk: What do teams need to consider as they look for data solutions that support their mission needs, especially in terms of cost, effectiveness, and security?
Lamb: Agencies need to make sure that their data is stored in the proper storage tier based on their needs – from “hot,” active data that needs to be stored in a high-performance tier at the edge for quick access, down to “cold,” inactive data that can be stored more cost effectively with object storage or a tape library. Cost, effectiveness, and security will depend on what their data is being used for and how the agencies need to interact with it.
For example, object storage is ideal for user files, as this data doesn’t require low latency or high input/output operations per second (IOPS). However, AI-supported applications require more performance, making all-flash or nonvolatile memory express (NVMe) storage a better fit.
Renzoni: The first thing to look at is ease of use. Secure, commercial-off-the-shelf products and services are readily available, so agencies shouldn’t need to invest in custom solutions. Scale is the other thing to look at. Not all solutions scale easily, and agencies could be at risk of seeing performance drop if their needs grow beyond their technology’s capabilities.
It’s also important to recognize that agencies’ data needs may shift over time. Data that is frequently accessed today may not be relevant in several months. Having the ability to retain infrequently accessed data in a cold storage tier is a cost-effective strategy.
MeriTalk: How do ViON and Quantum deliver solutions to support active and inactive data needs and empower the Federal mission?
Lamb: The ViON Forever Data Cloud uses Quantum technology to help agencies ensure their data is securely stored in the proper storage tier, whether a hot storage tier for active data or a cold storage tier for inactive data. We can integrate with every storage tier so agencies can move their data quickly between the different tiers based on their needs.
Data can be moved manually in emergent situations. However, agencies can also tier their data via policies, which automate this movement. For example, an organization might decide that all data not used in the last 30 days is moved to a cold, more cost-efficient storage tier.
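The 30-day policy Lamb gives as an example amounts to a simple rule evaluated against each object’s last-access time. The sketch below is hypothetical – the function and field names are invented for illustration – but it captures the logic a tiering engine (or a cloud lifecycle rule) automates.

```python
from datetime import datetime, timedelta

COLD_AFTER = timedelta(days=30)  # the 30-day threshold from the example policy

def apply_tiering_policy(objects: dict, now: datetime):
    """objects: key -> {'last_access': datetime, 'tier': 'hot'|'cold'}.
    Demote any hot object untouched for 30+ days to the cold tier;
    return the keys that were moved."""
    moved = []
    for key, meta in objects.items():
        if meta["tier"] == "hot" and now - meta["last_access"] >= COLD_AFTER:
            meta["tier"] = "cold"
            moved.append(key)
    return moved

now = datetime(2024, 6, 1)
objs = {
    "report.pdf": {"last_access": datetime(2024, 3, 1), "tier": "hot"},
    "live.db":    {"last_access": datetime(2024, 5, 30), "tier": "hot"},
}
print(apply_tiering_policy(objs, now))  # ['report.pdf']
```

Run on a schedule, a rule like this keeps active data on fast storage while steadily draining stale data to the cheaper tier, with no manual intervention.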
Agencies can get the benefits of as-a-Service pricing in a consumption-based OpEx model, eliminating the upfront charges of traditional IT procurement using CapEx funding for hardware.
Renzoni: Ease of use and scale are top of mind when it comes to Forever Data Cloud. As teams scope out long-term projects, they need to consider how their data will grow over time. Object storage capabilities bridge the gap between data storage tiers including object and tape – allowing agencies to access their current and historical data quickly, securely, and easily. This ensures that agencies’ technology not only supports their current needs but also their needs three or four years from now.
MeriTalk: Any final words of advice for Federal leaders developing strategies for dealing with living data?
Renzoni: Invest in infrastructure modernization. The majority of IT budgets go toward operations and maintenance of legacy systems. Modern infrastructure has lasting impacts and enables the future growth of an organization. As teams make improvements to their infrastructure and their data management strategies, the rest of their operations will also improve.
Lamb: An enterprise-wide approach is essential. Storage environments and tiers need to integrate so teams across the agency can access data efficiently and use it in new ways. As agencies increase their use of AI and ML, their needs will change due to the fast pace of innovation in this sector of technology. Anticipating those changes is difficult, but it’s an important exercise to ensure an agile approach to IT modernization. Leverage as-a-Service solutions to start small and grow as your needs change.