
Dell Pro Max with GB10

Secure, on-premises AI system purpose-built for the U.S. Courts

The Dell Pro Max with GB10 is a compact, high-performance AI appliance designed to help the U.S. Courts securely develop, fine-tune, and deploy advanced AI models on-premises — keeping confidential case data within the judiciary’s control while accelerating legal research, document analysis, and workflow automation.

It runs Linux (NVIDIA DGX OS) with the full NVIDIA AI software stack, so teams can move models from desktop development to a court data center or an approved cloud, when appropriate, with minimal code changes.
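
As a hedged illustration of that portability, the sketch below assumes a locally hosted, OpenAI-compatible inference endpoint (for example, a containerized LLM runtime serving a model on the GB10 itself); the endpoint URL, model name, and file name are illustrative placeholders, not a specific product configuration.

    # Minimal sketch: summarize a court document against a local,
    # OpenAI-compatible inference endpoint. URL, model, and file names
    # below are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # local endpoint on the GB10
        api_key="not-needed-locally",         # many local runtimes ignore the key
    )

    with open("opinion_draft.txt", "r", encoding="utf-8") as f:
        document = f.read()

    response = client.chat.completions.create(
        model="local-llm",  # whatever model the local runtime is serving
        messages=[
            {"role": "system", "content": "You summarize legal documents concisely."},
            {"role": "user", "content": f"Summarize the key points of this text:\n\n{document}"},
        ],
    )
    print(response.choices[0].message.content)

Moving the same workload to a court data center or an approved cloud is then largely a matter of pointing base_url at the new endpoint; the application code stays the same.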

Data Center-Class AI Performance at the Desk

 

  • Powered by the NVIDIA GB10 Grace Blackwell Superchip: a 20-core Grace CPU (10 Cortex-X925 + 10 Cortex-A725 cores) paired with a Blackwell GPU. 

  • 128 GB of LPDDR5X unified memory with 273 GB/s of memory bandwidth, letting very large models load entirely in local memory.

  • Loads models of up to 200 billion parameters in memory on a single unit; interconnecting two GB10 systems extends support to models of up to 400 billion parameters (see the footprint estimate after this list). 

  • Up to ~1 petaflop (1,000 TFLOPS) of FP4 AI compute in a compact form factor.

  • Pre-configured system: DGX OS and the full NVIDIA AI software stack, ready to deploy desk-side or in an edge environment.
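
A rough, weights-only estimate (ignoring KV cache, activations, and runtime overhead) shows why a 200-billion-parameter model stored at FP4 precision fits within 128 GB of unified memory:

    # Back-of-the-envelope weight footprint: parameters x bytes per parameter.
    params = 200e9          # 200 billion parameters
    bytes_per_param = 0.5   # 4-bit (FP4) weights = half a byte each

    weight_gb = params * bytes_per_param / 1e9
    print(f"Approx. FP4 weight footprint: {weight_gb:.0f} GB")  # ~100 GB, within 128 GB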


Why GB10 Fits the U.S. Courts Environment

 

  • Local AI processing: Maintain full control of confidential case data and court records without sending sensitive data to the cloud.

  • Regulatory alignment: Designed for environments with strict data-privacy, compliance, and sovereignty requirements — such as government, legal, and judicial workflows.

  • Compact, power-efficient deployment: Unlike large rack systems, the GB10 system enables desk-side or secure office deployments in space-constrained environments. 

  • Future-ready scalability: Because it uses the same NVIDIA stack used in data centers, work done locally on a GB10 can scale up (e.g., to a court data center or an approved cloud) with minimal code or workflow changes. 

Key Advantages for the U.S. Courts

 

  • Secure local AI development on confidential judiciary datasets — avoiding cloud-dependency risks.

  • High-performance architecture optimized for large-model AI workloads in a compact form factor.

  • Scalable, low-latency interconnect design: Two units can be linked over their QSFP SmartNIC ports to tackle larger-parameter models when needed (see the sketch after this list).

  • Rapid time-to-value: Pre-configured software stack, containerized workflows, and the NVIDIA-validated ecosystem let your team focus on models and use cases rather than infrastructure setup.
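
To illustrate the two-unit configuration mentioned above, here is a minimal connectivity sketch using PyTorch's standard distributed primitives over the high-speed link. The rendezvous address and script name are placeholders; a production deployment would normally rely on a serving framework's built-in multi-node support rather than hand-written collectives.

    # Minimal two-node check using PyTorch distributed (NCCL).
    # Launch with torchrun on both GB10 units (see the note below).
    import torch
    import torch.distributed as dist

    def main():
        dist.init_process_group(backend="nccl")  # rendezvous via torchrun-provided env vars
        rank = dist.get_rank()
        torch.cuda.set_device(0)                 # one Blackwell GPU per unit

        # Each unit contributes a tensor; all_reduce sums them across the link.
        payload = torch.ones(4, device="cuda") * (rank + 1)
        dist.all_reduce(payload, op=dist.ReduceOp.SUM)
        print(f"rank {rank}: {payload.tolist()}")  # both ranks print [3.0, 3.0, 3.0, 3.0]

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()

Run on each unit with, for example: torchrun --nnodes=2 --nproc_per_node=1 --rdzv_backend=c10d --rdzv_endpoint=<first-unit-address>:29500 two_node_check.py (the address, port, and file name are placeholders).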


Use Cases for Judicial & Administrative Teams

Case Document & Precedent Analysis

 

Use large language models and vision models for summarization, precedent research, and legal text comparison — securely on-premises.

Administrative Automation

​

Support court clerks and administrative staff with workflow automation, routing of filings, extraction of data from court documents, and secure inferencing on internal systems.
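
As one hedged example of filing routing, the sketch below uses the Hugging Face Transformers zero-shot classification pipeline with a public model; the model choice and queue labels are placeholders that a court team would replace with its own taxonomy, and inference runs entirely on the local system once the model has been downloaded.

    # Minimal sketch: route an incoming filing to a work queue with
    # zero-shot classification. Model and labels are illustrative placeholders.
    from transformers import pipeline

    classifier = pipeline(
        "zero-shot-classification",
        model="facebook/bart-large-mnli",  # example public model; runs locally after download
        device=0,                          # use the local GPU
    )

    filing_text = "Motion to extend the deadline for expert witness disclosures..."
    candidate_queues = [
        "scheduling motion",
        "discovery dispute",
        "sentencing memorandum",
        "notice of appeal",
    ]

    result = classifier(filing_text, candidate_labels=candidate_queues)
    print(result["labels"][0], round(result["scores"][0], 3))  # best-matching queue and its score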

Evidence & Discovery Review

​

Enable fast, local review of large collections of filings, transcripts, and exhibits with AI-enabled indexing and retrieval — while preserving data control.
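
A minimal retrieval sketch, assuming the sentence-transformers library and a small public embedding model as placeholders; documents are indexed and queried entirely in local memory, and a production deployment would typically swap in a proper vector store.

    # Minimal sketch: embed a few documents locally and retrieve the closest
    # match for a natural-language query. Model and texts are placeholders.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small public embedding model

    documents = [
        "Transcript of the March hearing on the motion to suppress.",
        "Exhibit list for the breach-of-contract trial.",
        "Deposition of the forensic accountant regarding asset transfers.",
    ]
    doc_vecs = model.encode(documents, normalize_embeddings=True)

    query = "Which document covers suppression of evidence?"
    query_vec = model.encode([query], normalize_embeddings=True)[0]

    scores = np.dot(doc_vecs, query_vec)   # cosine similarity (embeddings are normalized)
    best = int(np.argmax(scores))
    print(documents[best], float(scores[best]))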

Judicial Research & Decision Support

​

Develop specialized AI tools for judges and legal researchers using natural-language search, classification, and summarization — all within a secure, local environment where data never leaves the court's control.


If your court is exploring how to modernize operations with secure, on-premises AI, MCP Computer Products can assist.
