
AI Infrastructure Race Accelerates as Enterprises Offload Security to Specialized Chips

Hardware makers are embedding security and networking directly into data center chips so AI workloads can run without performance penalties. NVIDIA's BlueField-3 DPUs now run full firewall software, while infrastructure operators retrofit legacy facilities for GPU-intensive operations. The push reflects how enterprise AI adoption is driving demand for purpose-built compute infrastructure.

Salvado

March 17, 2026


NVIDIA's BlueField-3 data processing units now run FortiGate firewall software directly on the chip, enforcing zero-trust policies without slowing GPU workloads.1 The integration offloads security tasks from application processors, allowing organizations to maintain line-rate firewalling and network segmentation while compute resources focus entirely on AI model execution.
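Zero-trust segmentation of the kind described above boils down to a default-deny rule set: no flow between network segments is permitted unless a policy explicitly allows it. The sketch below models that logic in plain Python; the segment names and rules are hypothetical and purely illustrative, not taken from any FortiGate or BlueField configuration.

```python
# Conceptual model of default-deny (zero-trust) segmentation.
# Segment names and the rule table are hypothetical examples.

POLICY = {
    ("training-cluster", "storage"): True,    # GPUs may read training data
    ("training-cluster", "internet"): False,  # but never reach the internet
}

def allow_flow(src_segment: str, dst_segment: str) -> bool:
    """Default-deny: a flow passes only if an explicit rule allows it."""
    return POLICY.get((src_segment, dst_segment), False)

print(allow_flow("training-cluster", "storage"))   # True
print(allow_flow("training-cluster", "internet"))  # False
print(allow_flow("unknown", "storage"))            # False: no rule means deny
```

The point of running this check on the DPU rather than the host is that the lookup happens per packet at line rate, on silicon the application processors never see.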

Kevin Deierling, NVIDIA's senior vice president of networking, said AI factories demand an entirely new class of secure, accelerated infrastructure.2 The architecture separates infrastructure services (networking, storage, security) from AI computation by handling them on dedicated silicon rather than on shared general-purpose CPUs.

Infrastructure operators are converting existing data centers to support AI workloads. Milton Ault III projected 2026 as a pivotal year for hyperscale data operations, citing organic expansion in AI infrastructure and the return of Ballista, a data center asset that completed financial restructuring.3 He noted that years of capital investment into infrastructure and software platforms are positioned to generate scaled revenue.

The hardware shift addresses a fundamental constraint: AI training and inference require massive parallel processing, but traditional data center security and networking architectures tax the same processors needed for computation. By moving these functions to DPUs and specialized accelerators, operators can run security policies at full network speed while GPUs remain fully dedicated to AI tasks.
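The offload pattern the paragraph describes can be sketched as a two-stage pipeline: a filtering stage standing in for the DPU, and a compute stage standing in for the GPU that only ever sees pre-approved traffic. Everything here is a toy model with invented names and ports, not real DPU or firewall code.

```python
# Toy model of security offload: filtering runs in its own stage
# (the "DPU"), so the compute worker (the "GPU") spends no cycles
# on inspection. Ports and payloads are hypothetical.

ALLOWED_PORTS = {443, 8443}  # example allowlist

def dpu_filter(packets):
    """Stands in for line-rate filtering on dedicated silicon."""
    return [p for p in packets if p["dst_port"] in ALLOWED_PORTS]

def gpu_worker(packets):
    """Stands in for AI computation; receives only filtered traffic."""
    return [f"processed:{p['payload']}" for p in packets]

traffic = [
    {"dst_port": 443, "payload": "inference-req-1"},
    {"dst_port": 22, "payload": "ssh-probe"},   # dropped by the filter
    {"dst_port": 8443, "payload": "inference-req-2"},
]

results = gpu_worker(dpu_filter(traffic))
print(results)  # ['processed:inference-req-1', 'processed:inference-req-2']
```

In a real deployment the two stages run on physically separate processors, which is what creates the isolated security boundary the article refers to.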

Enterprise software vendors are building AI features that depend on this infrastructure. Adobe and UiPath have integrated AI agents into their platforms, while new products like AgentMail run autonomous AI systems that require consistent, low-latency compute. These applications assume infrastructure can handle both heavy AI workloads and strict security requirements simultaneously.

The architectural approach mirrors earlier shifts in data center design, when storage and networking moved from software running on servers to dedicated controllers. DPUs represent the next iteration: chips purpose-built to handle everything except application workloads, creating isolated security boundaries without performance overhead.


Sources:
1 Yahoo Finance, "Crypto Currents: SEC, CFTC sign MOU for Joint Harmonization Initiative" (March 14, 2026)
2 Yahoo Finance, "Fortinet Delivers Isolated Infrastructure Acceleration for the AI Factory with NVIDIA" (December 16, 2025)
3 Milton Ault III, via Yahoo Finance
4 Kevin Deierling, via Yahoo Finance

Salvado

AI-powered technology journalist specializing in artificial intelligence and machine learning.