AI News
Oracle ships quantum-resistant database with in-place AI agents—and a $1.5B partner bet
Oracle is betting that AI agents should run where the data lives, not where clouds want them. The company just shipped quantum-resistant encryption across its database stack and drew $1.5 billion in partner commitments, all before the platform hit general availability.
Oracle turns the database from storage into a runtime, and makes a post-quantum security play ahead of the threat.
Oracle has released a major database update that runs AI agents inside transactional systems and adopts post-quantum encryption end-to-end. The company says the new Oracle AI Database 26ai, unveiled on October 14, brings agentic workflows into the core engine and implements NIST-approved ML-KEM for data in transit and at rest—paired with a parallel launch of an AI Data Platform backed by $1.5 billion in partner investments and training pledges from Accenture, Cognizant, KPMG, and PwC. For customers on 23ai, Oracle says the October release update unlocks 26ai features without a full upgrade. It’s available across OCI, AWS, Azure, and Google Cloud.
Key Takeaways
• Oracle AI Database 26ai runs AI agents inside transactional systems with NIST-approved quantum-resistant encryption across network and storage layers.
• Partners committed $1.5 billion to train 8,000 professionals and build 100-plus use cases before general availability—signaling enterprise demand.
• Autonomous AI Lakehouse supports Apache Iceberg across all four hyperscalers, letting Oracle control data layers while clouds provide infrastructure.
• No database upgrade required for 23ai customers; advanced AI features like vector search carry no additional licensing costs.
What’s actually new
Start with the crypto. Oracle is rolling out ML-KEM (a NIST-standardized algorithm) across both network transport and data storage in its database stack. The company’s claim: rivals have implemented post-quantum cryptography in one layer or the other, but not both at once. That matters because “harvest now, decrypt later” isn’t fiction; state actors can capture encrypted data today and wait for practical quantum attacks tomorrow. Oracle wants that window closed.
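To make the mechanism concrete, here is a minimal sketch of ML-KEM key encapsulation using the open-source liboqs-python bindings (the `oqs` module); it illustrates the NIST algorithm generically, not Oracle's implementation, and assumes a liboqs build that enables the ML-KEM-768 parameter set.

```python
# Minimal sketch of ML-KEM key encapsulation via liboqs-python.
# Illustrative of the NIST algorithm only, not Oracle's implementation.
import oqs

ALG = "ML-KEM-768"  # NIST-standardized parameter set (FIPS 203)

# Receiver generates a keypair and publishes the public key.
receiver = oqs.KeyEncapsulation(ALG)
public_key = receiver.generate_keypair()

# Sender encapsulates: derives a shared secret plus a ciphertext
# that only the receiver's secret key can open.
sender = oqs.KeyEncapsulation(ALG)
ciphertext, shared_secret_sender = sender.encap_secret(public_key)

# Receiver decapsulates the same shared secret from the ciphertext.
shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
# The shared secret now keys a symmetric cipher (e.g., AES-256-GCM)
# protecting the actual data in transit or at rest.
```

The quantum resistance lives entirely in that key exchange; the shared secret then keys an ordinary symmetric cipher, which is why capturing traffic today only pays off if the attacker can eventually break the encapsulation.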
Then the runtime shift. Instead of bolting vector search onto a data store and shipping documents to a separate AI service, 26ai moves the agent scheduler to where transactions happen. Agents can reason iteratively against live records, invoke tools, and return actions without shuffling petabytes through brittle ETL chains. Less movement means lower latency, fewer governance headaches, and fewer copies of sensitive data. It also means the database team is suddenly in the AI loop.
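As a sketch of what acting where the data lives can look like, the query below uses the python-oracledb driver and Oracle's SQL VECTOR_DISTANCE function to rank live rows by similarity in place; the connection details, table, columns, and toy embedding are hypothetical.

```python
# Sketch: similarity search against live transactional rows in place,
# via the python-oracledb driver. Table, columns, and credentials are
# hypothetical; VECTOR_DISTANCE is Oracle's SQL vector function.
import array
import oracledb

conn = oracledb.connect(user="app", password="...", dsn="dbhost/orclpdb")
cur = conn.cursor()

# Embedding of the agent's current question (toy 4-dim example;
# real embeddings come from a model and have hundreds of dimensions).
query_vec = array.array("f", [0.12, -0.34, 0.56, 0.78])

# Rank open support tickets by cosine distance to the query vector,
# without copying data out of the transactional store.
cur.execute(
    """SELECT id, summary
         FROM support_tickets
        WHERE status = 'OPEN'
        ORDER BY VECTOR_DISTANCE(embedding, :qv, COSINE)
        FETCH FIRST 5 ROWS ONLY""",
    qv=query_vec,
)
for ticket_id, summary in cur:
    print(ticket_id, summary)
```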
How it works in practice
Oracle says 26ai supports the Model Context Protocol (MCP) so agents can query enterprise data with context and request more as they reason. A no-code builder lets teams compose multi-step, agentic workflows using prebuilt components. Under the covers, Exadata offloads vector operations to its smart storage, and an Exascale architecture brings elastic scaling to smaller footprints. The security posture is opinionated: row-, column-, and cell-level visibility; dynamic masking; and a database-native SQL firewall to flag and block suspicious activity.
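Oracle hasn't detailed its MCP surface here, but a generic MCP tool server built with the protocol's official Python SDK shows the shape of the exchange: the agent host discovers a tool, calls it with arguments, and receives structured context back. The server name and tool below are illustrative assumptions, not Oracle APIs.

```python
# Generic sketch of an MCP tool server using the official Python SDK
# ("mcp" package, FastMCP helper). Oracle's actual MCP surface may
# differ; the tool below is a hypothetical illustration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("enterprise-db")

@mcp.tool()
def lookup_customer(customer_id: str) -> str:
    """Return a context snippet for one customer record."""
    # In a real deployment this would query the database in place;
    # a canned response keeps the sketch self-contained.
    return f"Customer {customer_id}: segment=enterprise, open_tickets=2"

if __name__ == "__main__":
    # stdio transport: the agent host spawns this process and
    # exchanges MCP messages over stdin/stdout.
    mcp.run()
```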
There’s also a laundry list for CIOs who care about resilience. A zero-data-loss recovery service targets ransomware risks. A globally distributed database option promises sub-three-second failover using Raft-based replication. And a mid-tier “True Cache” aims to keep SQL, vector, JSON, spatial, and graph queries consistent without developer-managed cache code. None of this is flashy. It is the kind of plumbing enterprises measure.
The multi-cloud calculus
Oracle’s Autonomous AI Lakehouse now runs on all four hyperscalers and supports Apache Iceberg. Interoperability with Databricks and Snowflake in the same clouds lets customers keep existing pipelines while tapping Exadata performance and serverless, pay-per-use scaling. In strategy terms, this is infrastructure arbitrage: let AWS, Azure, and Google sell compute and storage while Oracle owns the high-trust data layer that enterprises won’t rewrite every budget cycle. The openness is real on formats and protocols. The performance moat remains very much proprietary.
Follow the money, not the demo
The partner numbers go beyond press-tour enthusiasm. Accenture, Cognizant, KPMG, and PwC have collectively pledged more than $1.5 billion to train 8,000-plus practitioners and prebuild 100-plus industry use cases tied to Oracle’s AI Data Platform. Systems integrators don’t staff thousands of people on a whim; they do it when clients are funding transformations with near-term deliverables. If those benches fill, it’s a real signal of demand.
Enterprise scorecard: what to watch
Three things over the next 12 months will show whether Oracle’s bet pays off. First, paid implementations of the AI Data Platform beyond pilots; that’s the revenue test. Second, how many 23ai customers switch on 26ai via the release update without breaking app certifications; that’s the friction test. Third, Autonomous AI Lakehouse adoption on non-Oracle clouds; that’s the platform-control test. If all three move, Oracle will have turned the database into AI infrastructure, not just AI’s data source.
Caveats and trade-offs
Running agents inside the database shortens the path to action, but it centralizes risk. Guardrails, auditability, and rate-limiting need to be first-class features, not slideware. Post-quantum crypto adds overhead; performance claims should be validated under real workloads. And “open” stops at the edge of Exadata economics. You’ll get Iceberg, MCP, and REST, but the best numbers will almost certainly depend on Oracle’s own stack.
Why this matters
- Quantum-resistant encryption deployed now counters “harvest-and-decrypt” risks before large-scale quantum attacks arrive.
- Treating the database as the AI runtime flips ETL economics: agents act where data lives, cutting cost, latency, and exposure.
❓ Frequently Asked Questions
Q: What is ML-KEM and why does quantum resistance matter before quantum computers exist?
A: ML-KEM (Module-Lattice-Based Key Encapsulation Mechanism) is a NIST-approved algorithm designed to resist quantum computer attacks. It matters now because adversaries can capture encrypted data today and store it until quantum computers powerful enough to decrypt it become available—potentially within 5-10 years. Oracle's encrypting both data in transit and at rest to close that window.
Q: How does upgrading from 23ai to 26ai work without breaking existing applications?
A: Customers apply Oracle's October 2025 release update—essentially a patch, not a full version upgrade. The database schema, API contracts, and application interfaces remain unchanged, so certified apps continue running without recertification. New AI features like vector search and agentic workflows become available but don't activate unless explicitly configured. No data migration required.
Q: What's the practical difference between running AI agents in-database versus traditional cloud AI services?
A: Traditional approaches copy data from production databases to separate AI platforms via ETL pipelines—creating latency, governance gaps, and multiple data copies. In-database agents query live transactional data directly, reason iteratively, and return actions without moving petabytes. For a 10TB customer database, that's the difference between hours of ETL and sub-second queries with no data duplication.
Q: What does the $1.5 billion partner investment actually buy?
A: The $1.5 billion funds training 8,000+ consultants, data engineers, and architects across Accenture, Cognizant, KPMG, and PwC. It also covers development of 100+ pre-built industry use cases—templates for finance, healthcare, supply chain, and other sectors that customers can deploy rather than starting from scratch. Systems integrators don't staff at that scale without client demand pipelines.
Q: Does quantum-resistant encryption slow down database performance?
A: Post-quantum algorithms like ML-KEM do add computational overhead compared to current RSA or elliptic curve methods—typically 10-30% for key generation and encryption operations. Oracle hasn't published specific benchmarks yet. The performance impact matters most for high-frequency transaction systems; batch analytics workloads won't notice. Enterprises will need to test under actual workloads before deployment.