
Building a Data-Driven Government
The evolution in AI technology
Increasingly sophisticated hardware and software make it easy to tailor AI systems to agencies’ needs
There are many ways for government agencies to quickly build and deploy an infrastructure that powers artificial intelligence workloads. In fact, we're seeing this happen at scale and faster than we've seen in the past.
AI and high-performance computing are intrinsically tied together. To process data and run AI algorithms at the speed the technology requires, agencies need to upgrade their hardware, and HPC architectures are the natural choice.
Today, agencies can essentially buy HPC systems for AI workloads that are scalable, flexible and efficient right off the shelf. Dell Technologies and other companies are building infrastructure solutions, and customers are layering on the software stack from places like GitHub. Companies are also creating composable architectures with virtual workloads across compute, memory, networking, storage devices and software.
Flexibility and built-in security
Alternatively, many customers simply want to deploy an AI-ready architecture that gives them the flexibility they need across their workloads. Such an architecture allows them to quickly start getting answers from their data because the vendor has already done the work of researching the best components and making sure everything works together seamlessly and efficiently.
As agencies think about choosing an infrastructure for AI capabilities, they should consider new architectures and compute capabilities from accelerator companies, which are constantly evolving. For example, early Dell PowerEdge platforms had only three accelerators, but today that number has increased to seven, and that is a trend across the industry. We've also seen the rise of GPUs, intelligence processing units and field-programmable gate arrays.
Many of these solutions have built-in security capabilities so agencies can quickly layer on FedRAMP requirements or the appropriate Defense Department standards.
From data lake to federated learning
To optimize the performance of their AI workloads, agencies need fast, secure access to data and the ability to glean intelligent insights across complex IT environments. Many agencies are creating massive data lakes and transferring data from the edge to those central locations before analyzing it. That approach delays decision-making and requires costly networking services.

By contrast, federated learning allows agencies to set up containers with AI algorithms at the edge so they can train machine learning models and analyze data right where it is.
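Conceptually, a federated approach moves the model to the data rather than the data to the model: each edge site trains on its own data, and only the resulting model weights travel back to be aggregated. The sketch below illustrates that pattern with a simple federated-averaging loop in Python; the toy linear model, synthetic edge datasets and function names are hypothetical and show only the communication flow, not any particular vendor's implementation.

# Illustrative federated-averaging loop (hypothetical example; a toy
# linear model and synthetic edge datasets stand in for real workloads).
import numpy as np

def local_update(weights, features, labels, lr=0.1, epochs=5):
    # Train a small linear model at one edge site using only its own data.
    w = weights.copy()
    for _ in range(epochs):
        grad = features.T @ (features @ w - labels) / len(labels)
        w -= lr * grad
    return w

def federated_average(global_weights, edge_datasets):
    # Each site trains locally; only weights travel, never the raw data.
    local = [local_update(global_weights, X, y) for X, y in edge_datasets]
    sizes = [len(y) for _, y in edge_datasets]
    # Weight each site's contribution by how much data it holds.
    return np.average(local, axis=0, weights=sizes)

# Three synthetic edge sites, each holding its own (features, labels) data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
edges = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    edges.append((X, X @ true_w + rng.normal(scale=0.1, size=100)))

w = np.zeros(2)
for _ in range(20):  # communication rounds
    w = federated_average(w, edges)
print(w)  # converges toward true_w without ever pooling the data centrally

In practice, an agency would substitute its production model and container orchestration, but the communication pattern stays the same: train locally, exchange model updates, aggregate centrally.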

































































































