Secret Network: Privacy at the Frontier - From Confidential Smart Contracts to Confidential AI

This Spaces session explores how Secret Network is evolving from a privacy-preserving smart contract L1 into a chain-agnostic privacy hub for Web3 and AI. Host Paul Speaks interviews Luke Bowman (COO, Secret Network Foundation), who traces his path from 2017 trader to privacy advocate, arguing that self-sovereignty is impossible without privacy. Bowman outlines the shift from monolithic L1 to interoperable SDKs, enabling developers on any chain—and even Web2—to add confidential execution. The conversation details Secret AI and Secret Virtual Machines (VMs), which run models and applications inside trusted execution environments (TEEs) with cryptographic attestation, replacing “trust me” with “verify me.” They discuss adoption paths (institutions first, then retail), the need for privacy-by-default AI akin to HTTPS, and the danger of bad regulation that entrenches gatekeepers. Q1 2026 milestones include AMD SEV support (alongside Intel TDX), on-demand seed rotation, and pending semi-permissioned nodes as part of the response to a 2025 physical TEE attack and a shift toward “Proof of Cloud.” Success for 2026 centers on real usage: onboarding enterprises, spinning up apps quickly via containerized Secret VMs, and running a free compute + support campaign for builders.

Cryptic Talks: Privacy, AI, and Secret Network’s Roadmap

Participants and Context

  • Host: Paul Speaks (Cryptic Talks)
  • Guest: Luke Bowman (COO, Secret Network Foundation)
  • Brief context: Session began with technical difficulties typical of X/Twitter Spaces. The conversation focused on privacy infrastructure, confidential computing, decentralized AI, and Secret Network’s 2026 roadmap and execution progress.

Core Theme: Privacy as Foundational Infrastructure for Web3 and AI

  • Paul’s framing: Privacy must evolve beyond the false trade-off with transparency; the future requires confidentiality without sacrificing verifiability. As AI accelerates, the infrastructure must protect data while enabling trust—privacy as invisible, default infrastructure akin to how HTTPS became standard.
  • Luke’s thesis: Blockchain’s “fatal flaw” is default public transparency. Genuine self-sovereignty and civil liberty require privacy as a core primitive; otherwise AI turns users into an even more exploitable product. Both privacy and verifiability must be solved together.

Secret Network’s Evolution and Identity

  • Mission continuity:
    • Secret launched (2020) with privacy-preserving smart contracts, the first of their kind on mainnet, extending beyond transactional privacy (e.g., Zcash, Monero) to private logic and state.
    • Core identity: A practical hub for confidential computing that builders can integrate to protect user data and enable new use cases.
  • Architectural shift (circa 2022 onward):
    • From building everything on a single L1 to a modular, chain-agnostic privacy layer.
    • SDKs/tools to integrate privacy regardless of base chain (Ethereum, Solana, L2s), minimizing migrations and making adoption feel natural.
    • Strategic aim: Unlock institutional adoption by protecting proprietary strategies and sensitive information.
  • Expansion into AI (2024–2025):
    • Secret AI previewed late 2024; 2025 focused on building it and its extension: Secret Virtual Machines (Secret VM).
    • Objective: Become the privacy hub for both Web3 and AI—confidential execution plus cryptographic attestation for decentralized/intelligent systems.

Confidential Computing, Secret AI, and Verifiability

  • Technical model:
    • Secret AI runs LLMs inside confidential virtual machines backed by Trusted Execution Environments (TEEs).
    • TEEs provide:
      • Data protection at the hardware level: data remains encrypted even while being processed (“encryption at compute”).
      • Attestation: Cryptographic proofs that the claimed code is exactly what is running—moving from “trust me” to “verify me.”
  • Hardware support and options:
    • Intel TDX (enterprise-grade TEEs) was the initial stack.
    • AMD SEV added in Q1 2026, expanding options for workload-specific tradeoffs.
    • Nvidia is expected to introduce additional relevant chips; current production usage centers on Intel TDX and AMD SEV.
  • Developer UX:
    • Secret VM emphasizes low friction: containerized applications (roughly 99% of apps) can be launched quickly, often within a day, without extensive refactoring.
    • Enables privacy-preserving logic, state confidentiality, and attestable execution.
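
The attestation model described above can be sketched in miniature. Everything in this example is hypothetical and heavily simplified: a real TEE (Intel TDX, AMD SEV) signs its measurement with a hardware-rooted key, and the verifier checks that signature against a vendor certificate chain; here an HMAC over a shared key stands in for the hardware signature so the sketch stays self-contained. The point it illustrates is the shift from “trust me” to “verify me”: the client recomputes the expected measurement of the code and checks it against the signed report.

```python
import hashlib
import hmac

# Stand-in for the hardware vendor's root of trust (hypothetical).
VENDOR_KEY = b"hardware-rooted-signing-key"

def measure(code_image: bytes) -> str:
    """Hash of the code the enclave claims to be running."""
    return hashlib.sha256(code_image).hexdigest()

def attest(code_image: bytes) -> dict:
    """Produced inside the TEE: a measurement plus a 'signature' over it.
    (A real TEE signs with a key only the hardware can use.)"""
    m = measure(code_image)
    sig = hmac.new(VENDOR_KEY, m.encode(), hashlib.sha256).hexdigest()
    return {"measurement": m, "signature": sig}

def verify(report: dict, expected_image: bytes) -> bool:
    """Run by the client: 'don't trust, verify'. Checks both that the
    report is authentically signed and that the measured code matches
    the code the client expected to be running."""
    expected_sig = hmac.new(
        VENDOR_KEY, report["measurement"].encode(), hashlib.sha256
    ).hexdigest()
    sig_ok = hmac.compare_digest(report["signature"], expected_sig)
    return sig_ok and report["measurement"] == measure(expected_image)

deployed = b"print('confidential agent')"
report = attest(deployed)
assert verify(report, deployed)              # code matches the claim
assert not verify(report, b"tampered code")  # any tampering is detected
```

The design point: the verifier never has to trust the operator's word, only the hardware-rooted signature and its own hash of the expected code, which is what makes the claim checkable by anyone.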

Adoption Paths: Builders, Institutions, and Retail

  • Luke’s view:
    • Near-term adoption led by builders and institutions with distribution capacity; killer apps will onboard retail at scale.
    • Retail also benefits now: users can spin up personal confidential agents using open-source models (e.g., DeepSeek variants) with verifiability and data protection.
  • Paul’s perspective:
    • Lowering builder barriers and embedding privacy as an invisible layer accelerates ecosystem growth.
    • If privacy becomes standard and unobtrusive, users get intelligence without exposure.

Privacy-Preserving AI as the New Default

  • Luke’s fork-in-the-road framing:
    • Utopia: AI amplifies human efficiency and abundance while preserving self-sovereignty and civil liberties.
    • Dystopia: A pervasive AI surveillance state, centralized data monopolies, and systemic vulnerabilities.
    • Outcome depends on early, correct adoption of privacy plus verifiability at the foundational layer.
  • Paul’s analogy:
    • Privacy-preserving AI should become as expected as HTTPS—moving from optional security to ubiquitous baseline.

Biggest Risk: Bad Regulation

  • Luke’s critique:
    • Poorly informed regulation (despite good intentions) pushes familiar but flawed centralization models—gatekeepers holding all data.
    • Foundational protections via confidential computing (TEEs + attestation) are superior to relying on corporate promises.
    • Example concern: Terms of major AI platforms that claim broad rights over user-provided content, potentially including creative output (e.g., patent prep). Such centralization puts ownership and autonomy at risk.
  • Desired approach:
    • Regulation that recognizes and enables verifiable, decentralized privacy tech rather than entrenching centralized data custodians.

Q1 2026: Milestones and Security Posture

  • Delivered:
    • AMD SEV support: Expanded TEEs beyond Intel TDX, giving builders more hardware optionality to match workloads.
    • On-demand seed rotation: Key/secret rotation capability reduces long-term exposure risk—akin to regularly changing locks in secure systems.
  • Pending (on track):
    • Semi-permissioned nodes with “Proof of Cloud”:
      • Background: A Q4 2025 physical attack demonstrated that owners with physical access to certain chips could extract secrets.
      • Immediate response: Temporarily shifted from fully permissionless to permissioned confidential computing to mitigate risk.
      • Path forward: Semi-permissioned model that verifies secure physical placement (e.g., hardened data centers), and proves that any access (maintenance, etc.) was non-malicious.
      • Emphasis: Maintain verifiability of protective claims—security must be demonstrable, not assumed.
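
The seed-rotation idea above, changing the locks so an old key stops working, can be shown with a toy sketch. All names here are hypothetical, and the XOR keystream is a deliberately simplified stand-in for real authenticated encryption inside a TEE; the sketch only demonstrates the mechanism: decrypt state under the old seed, re-encrypt under the new one, and a previously exposed seed becomes useless.

```python
import hashlib
import secrets

def keystream(seed: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a seed (toy counter-mode hash)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(data: bytes, seed: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR with a seed-derived keystream.
    (Real systems use authenticated encryption, e.g., AES-GCM.)"""
    ks = keystream(seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def rotate(ciphertext: bytes, old_seed: bytes, new_seed: bytes) -> bytes:
    """On-demand rotation: decrypt under the old seed, re-encrypt under
    the new one, so exposure of the old seed no longer matters."""
    return xor_cipher(xor_cipher(ciphertext, old_seed), new_seed)

old_seed, new_seed = secrets.token_bytes(32), secrets.token_bytes(32)
state = b"confidential contract state"
ct = xor_cipher(state, old_seed)
ct2 = rotate(ct, old_seed, new_seed)
assert xor_cipher(ct2, new_seed) == state   # new seed recovers the state
assert xor_cipher(ct2, old_seed) != state   # old seed no longer works
```

This is the “regularly changing locks” property from the roadmap: rotation bounds the window in which any single compromised key is useful.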

2026 Success Criteria

  • Luke’s goals:
    • Transition from building to usage: Drive adoption of Secret AI and Secret VM, particularly among large enterprises.
    • Aggressive builder outreach:
      • Offer free compute and hands-on support to onboard applications to mainnet.
      • Make “don’t trust, verify” a practical user promise—shift trust architecture across the stack.
  • Market context:
    • Regardless of macro cycles (Paul mentioned a personal Bitcoin sentiment call to ~30k), the focus remains relentless builder engagement and deployment.

Host’s Synthesis and Perspective

  • Paul’s takeaways:
    • Privacy isn’t anti-compliance or anti-innovation—it enables both when designed for verifiable confidentiality.
    • The next wave of blockchain adoption hinges as much on trust architecture as on speed/scalability.
    • Secret Network’s trajectory exemplifies moving privacy from abstract ideals to real execution: seed rotation, hardware enclaves, governance reform, confidential AI—all being deployed.

Actionable Guidance and Calls to Action

  • Luke’s advice to users:
    • Treat personal privacy as essential—be mindful of location sharing, social media oversharing, and on-chain leakage (especially in DeFi).
  • Luke’s advice to builders:
    • Design for user privacy from the start, especially on crypto rails.
    • Secret Network Foundation can help—whether or not teams ultimately use its stack—by advising on available tools and best-fit technologies.
  • Builder offerings:
    • Free compute and onboarding support for teams integrating Secret VM/AI into production.

Highlights and Notable Quotes (Paraphrased)

  • “Blockchain’s fatal flaw is public-by-default transparency; true self-sovereignty demands privacy.” — Luke
  • “Don’t trust, verify: attestation proves the code running is the code claimed.” — Luke
  • “Without privacy, AI turns people into an even more exploitable product.” — Luke
  • “Privacy-preserving AI should become default infrastructure, like HTTPS.” — Paul
  • “Bad regulation entrenches gatekeepers; foundational confidential computing offers a better path.” — Luke

Closing

  • Conversation balanced philosophy and technical execution. Secret Network positions privacy as a practical, verifiable layer for Web3 and AI, evolving from a privacy-first L1 to a modular, chain-agnostic confidentiality hub. Q1 progress strengthens hardware diversity and operational security, while 2026 focuses on enterprise and builder adoption—making verifiable privacy a lived reality across applications.