I had the honor of attending NVIDIA GTC 2026 alongside our Principal AI Architect, Peet Cremer, our Head of Cognite Atlas AI, Justin Lawyer, and our SVP of Partners & Alliances, Trygve Ronnigen. Between Jensen Huang's visionary keynote and the buzzing energy at the Microsoft Pavilion, one thing is clear: we aren't just talking about "cool AI" anymore - we are talking about AI that works in the physical, messy, and complex world of heavy industry.
Our team spent the week diving deep into the next generation of compute and intelligence. Here are the core insights we’re bringing back from the show floor.
Key Takeaways from the GTC Floor
The pace of innovation is accelerating, and NVIDIA's latest announcements are shifting the goalposts for what's possible in industrial AI:
- Security for the Agentic Era: As we move toward autonomous AI agents, security is non-negotiable. NVIDIA's launch of NemoClaw, featuring the OpenShell sandbox and enhanced privacy, is a big step forward in making AI safe for enterprise deployment, though more is still needed: one thing I would like to see is the integration of guardrail models at every step of agentic reasoning.
- The Efficiency Revolution: Inference demand is skyrocketing. Reasoning models require 100x more tokens, and adding context pushes that figure another 100x higher. The new Vera Rubin architecture, with its dedicated inference chip, is 35x more efficient for low-latency workloads, proving that "Tokens per Watt" is the new metric for success.
- Closing the Gap on Proprietary Models: With NemoTron 3 Super (and the upcoming Ultra), the performance of open-weight models is now neck-and-neck with the best proprietary systems, giving companies more flexibility in how they build.
Beyond LLMs: The Rise of Time Series Foundation Models
While LLMs dominate the headlines, the "silent" backbone of industry is time-series data. Most industrial AI pilots die in the lab because this data is too complex and siloed to scale. At our session, "Transforming Industrial Ops: Real-Time Forecasting and Anomaly Detection," we demonstrated how Cognite is breaking that cycle.
We showcased our collaboration with NVIDIA, integrating the NV-Tesseract Time Series foundation model into the Cognite Industrial Knowledge Graph.
The Core Insight: The same transformer-based architectures that power Large Language Models can be adapted to monitor and forecast industrial sensors at an unprecedented scale.
Why this matters for the Industry:
- Scaling Across the Enterprise: We move beyond "one model for one pump" to foundational models that can handle thousands of assets simultaneously.
- Context is King: By linking NV-Tesseract to our Industrial Knowledge Graph, we provide the model with the physics and relationship data it needs to be accurate.
- Real-World Impact: This combination allows teams to move from manual "firefighting" to high-value innovation by preventing disruptions before they happen.
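To make the forecasting-based approach concrete, here is a minimal sketch of the general pattern behind it: compare each sensor reading against a model's forecast and flag readings whose residual is a statistical outlier. This is an illustrative rolling z-score baseline, not NV-Tesseract or Cognite's actual implementation; the `forecast` array here is a hypothetical stand-in for whatever model produces the predictions.

```python
import numpy as np

def detect_anomalies(observed, forecast, window=50, z_thresh=3.0):
    """Flag readings whose forecast residual deviates strongly
    from recent residual behavior (rolling z-score)."""
    residuals = observed - forecast
    flags = np.zeros(len(observed), dtype=bool)
    for t in range(window, len(observed)):
        recent = residuals[t - window:t]       # trailing residual window
        mu, sigma = recent.mean(), recent.std()
        if sigma > 0 and abs(residuals[t] - mu) > z_thresh * sigma:
            flags[t] = True
    return flags

# Synthetic example: a steady sensor with one injected spike.
rng = np.random.default_rng(0)
observed = rng.normal(100.0, 1.0, 200)
forecast = np.full(200, 100.0)   # stand-in for a model forecast
observed[150] += 15.0            # injected anomaly
flags = detect_anomalies(observed, forecast)
print(flags[150])  # → True
```

The value of a foundation model in this loop is that a single learned forecaster replaces the per-asset statistical model, so the same residual check scales across thousands of sensors.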
Delivering Value Today
We are already moving full steam ahead with our customers to turn these ideas into reality:
- Aker BP: Transforming industrial anomaly detection and safeguarding critical assets.
- Celanese: Improving the prediction of reaction water levels in production units, a metric critical to stable and efficient plant operation.
The engagement from the GTC audience was fantastic, especially around how we tune these models for specialized equipment. We're taking that challenge head-on: Cognite is now set to build our own Time Series foundation models, specifically architected for the unique demands of heavy industry.
Watch the Session
Missed us in person? You can catch the full recording of our presentation below to see the benchmarks and real-world customer case studies in action.
Watch: Transforming Industrial Ops with Foundational Time-Series Models