
Why Verifiable Data Is the Missing Layer in AI: Walrus

In brief

  • Decentralized data layer Walrus is aiming to provide a “verifiable data foundation for AI workflows” in conjunction with the Sui stack.
  • The Sui stack includes data availability and provenance layer Walrus, offchain environment Nautilus and access control layer Seal.
  • Several AI teams have already chosen Walrus as their verifiable data platform, with Walrus functioning as “the data layer in a much larger AI stack.”

AI models are getting faster, larger, and more capable. But as their outputs begin to shape decisions in finance, healthcare, enterprise software, and beyond, an important question needs to be answered—can we actually verify the data and processes behind those outputs?

“Most AI systems rely on data pipelines that nobody outside the organization can independently verify,” says Rebecca Simmonds, Managing Executive of the Walrus Foundation, the organization that supports development of the decentralized data layer Walrus.

As she explains, there is no standard way to confirm where data came from, whether it was tampered with, or what was authorized for use in the pipeline. That gap doesn’t just create compliance risk—it erodes trust in the outputs AI produces.

“It’s about moving from ‘trust us’ to ‘verify this,’” Simmonds said, “and that shift matters most in financial, legal, and regulated environments where auditability isn’t optional.”

Why centralized logs aren’t enough

Many AI deployments today rely on centralized infrastructure and internal audit logs. While these can provide some visibility, they still require trust in the entity running the system.

External stakeholders have no choice but to trust that the records haven’t been altered. With a decentralized data layer, integrity is anchored cryptographically, so independent parties can verify those records without relying on a single operator.

This is where Walrus positions itself: as the data foundation within a broader architecture referred to as the Sui Stack. Sui itself is a layer-1 blockchain network that records policy events and receipts onchain, coordinating access and logging verifiable activity across the stack.

The Sui Stack. Image: Walrus

“Walrus is the data availability and provenance layer—where each dataset gets a unique ID derived from its contents,” Simmonds explained. “If the data changes by even a single byte, the ID changes. That makes it possible to verify that the data in a pipeline is exactly what it claims to be, hasn’t been altered, and remains available.”
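
To make that concrete, here is a minimal sketch in Python of a content-derived identifier (the general idea only, not the Walrus API): the ID is a hash of the dataset’s bytes, so changing a single byte produces an entirely different ID.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Derive an identifier purely from the bytes of the dataset."""
    return hashlib.blake2b(data, digest_size=32).hexdigest()

original = b"account,balance\nA-1001,2500.00\n"
tampered = b"account,balance\nA-1001,2500.01\n"  # a single byte changed

print(content_id(original))  # ID of the dataset as published
print(content_id(tampered))  # a completely different ID
assert content_id(original) != content_id(tampered)
```

Anyone holding the expected ID can re-hash the bytes they received and confirm they match, without trusting the party that served them.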

Other components of the Sui Stack build on that foundation. Nautilus lets developers run AI workloads in a secure offchain environment and generate proofs that can be checked onchain, while Seal handles access control, letting teams define and enforce who can see or decrypt data, and under what conditions.
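
As a rough illustration of condition-based access control in the spirit of Seal, the sketch below grants decryption only when every policy condition holds; the class and field names are invented for the example and are not Seal’s API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AccessPolicy:
    dataset_id: str        # content-derived ID of the protected dataset
    allowed_keys: set      # public keys permitted to request decryption
    not_after: datetime    # access expires after this point in time

def may_decrypt(policy: AccessPolicy, requester_key: str, dataset_id: str) -> bool:
    """Grant access only if every condition in the policy is satisfied."""
    now = datetime.now(timezone.utc)
    return (
        dataset_id == policy.dataset_id
        and requester_key in policy.allowed_keys
        and now <= policy.not_after
    )
```

In the stack described above, Sui records the rules and proofs onchain, so policy decisions like this become part of the same shared, checkable record.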

“Sui then ties everything together by recording the rules and proofs onchain,” Simmonds said. “That gives developers, auditors, and users a shared record they can independently check.”

“No single layer solves the full AI trust problem,” she added. “But together, they form something important: a verifiable data foundation for AI workflows—data with provable provenance, access you can enforce, computation you can attest to, and an immutable record of how everything was used.”

Several AI teams have already chosen Walrus as their verifiable data platform, Simmonds said, including open-source AI agent platform elizaOS, and blockchain-native AI intelligence platform Zark Lab.

Verifiable, not infallible

The phrase “verifiable AI” can sound ambitious. But Simmonds is careful about what it does—and doesn’t—imply.

“Verifiable AI doesn’t explain how a model reasons or guarantee the truth of its outputs,” she said. But it can “anchor workflows to datasets with provable provenance, integrity, and availability.” Instead of relying on vendor claims, she explained, teams can point to a cryptographic record of what data was available and authorized. When data is stored with content-derived identifiers, every modification produces a new, traceable version—allowing independent parties to confirm what inputs were used and how they were handled.
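
A small sketch of that idea, assuming a hypothetical append-only history keyed by content-derived IDs (not a real Walrus or Sui interface): every modification appends a new entry, and any independent party can re-derive an ID to confirm which version was actually used.

```python
import hashlib
import time

def content_id(data: bytes) -> str:
    return hashlib.blake2b(data, digest_size=32).hexdigest()

class DatasetHistory:
    """Append-only log of dataset versions, keyed by content-derived IDs."""

    def __init__(self):
        # In a real deployment this record would be anchored onchain, not held in memory.
        self.versions = []

    def record(self, data: bytes, note: str) -> str:
        cid = content_id(data)
        self.versions.append({"id": cid, "note": note, "timestamp": time.time()})
        return cid

    def verify(self, data: bytes, expected_id: str) -> bool:
        """Independent parties re-derive the ID and compare it to the recorded one."""
        return content_id(data) == expected_id
```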

This distinction is crucial. Verifiability isn’t about promising perfect results. It’s about making the lifecycle of data—how it was stored, accessed, and modified—transparent and auditable. And as AI systems move into regulated or high-stakes environments, this transparency becomes increasingly important.

“Finance is a pressing use case,” Simmonds said, where “small data errors” can turn into real losses thanks to opaque data pipelines. “Being able to prove data provenance and integrity across those pipelines is a meaningful step toward the kind of trust these systems demand,” she said, adding that it “isn’t limited to finance. Any domain where decisions have consequences—healthcare, legal—benefits from infrastructure that can show what data was available and authorized.”

A practical starting point

For teams interested in experimenting with verifiable infrastructure, Simmonds suggests starting with the data layer as a “first step” rather than attempting a wholesale overhaul.

“Many AI deployments rely on centralized storage that’s really difficult for external stakeholders to independently audit,” she said. “By moving critical datasets onto content-addressed storage like Walrus, organizations can establish verifiable data provenance and availability—which is the foundation everything else builds on.”
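
In storage terms, that first step looks roughly like the toy content-addressed store below (illustrative only; Walrus itself distributes blobs across a decentralized network rather than a single process): objects are written and retrieved by the hash of their contents, so an integrity check comes for free on every read.

```python
import hashlib

class ContentAddressedStore:
    """Toy in-memory content-addressed store; illustrative, not Walrus itself."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        cid = hashlib.blake2b(data, digest_size=32).hexdigest()
        self._blobs[cid] = data  # the key is derived from the content itself
        return cid

    def get(self, cid: str) -> bytes:
        data = self._blobs[cid]
        # Re-hash on retrieval: a mismatch would mean the blob had been altered.
        assert hashlib.blake2b(data, digest_size=32).hexdigest() == cid
        return data
```

The same check carries over from this toy example to a distributed setting, because verification needs only the data and its expected ID.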

In the coming year, one focus for Walrus is expanding the roster of partners and builders on the platform. “Some of the most exciting stuff is what we’re seeing developers build—from decentralized AI agent memory systems to new tools for prototyping and publishing on verifiable infrastructure,” she said. “In many ways, the community is leading the charge, organically.”

“We see Walrus as the data layer in a much larger AI stack,” Simmonds added. “We’re not trying to be the whole answer—we’re building the verifiable foundation that the rest of the stack depends on. When that layer is right, new kinds of AI workflows become possible.”

Source: https://decrypt.co/358431/why-verifiable-data-is-the-missing-layer-in-ai-walrus
