Live Stream With Steve - Hear From Dev & OOBE Core (also @ pump.fun)

The Space was a live walkthrough of Synapse, a low-latency RPC and WebSocket gateway for Solana built by Steve and the UBI team, with Sulking hosting and Alex moderating. Steve demoed the playground and dashboard (requests, latency, cache hit-rate), an RPC Explorer, and SDK modules nearing completion (Smart Cache, Circuit Breaker, Load Balancer, DeFi components). A new batch executor reduces latency and saves quota across subscription tiers, and WebSocket subscriptions are supported (WS/WSS to be confirmed). In Q&A, WiFi compared Synapse's speed favorably to Helius; Steve attributed the performance to sophisticated load balancing, dedicated non-archival nodes, and an optimized caching layer, with Old Faithful and gRPC integrations coming for historical and high-throughput methods. Documentation is live and updated per commit. Pricing will be flexible and more affordable than competitors, with a free tier and higher tiers unlocking more methods. Marketing will center on case studies from testers (e.g., Degen AI), incentives for early adopters, and word of mouth. The team is formalizing a company in Italy, modernizing the brand (V3), recruiting community managers, and targeting an early-November public release.

Synapse RPC Twitter Spaces + Livestream Recap

Participants and Roles

  • Sulking — host; community/marketing lead, drives go-to-market and partnerships
  • Alex — co-host; brand and marketing lead (V3 brand revamp), community engagement
  • Steve — lead developer/infra; architect and presenter of Synapse RPC, SDK, and infra roadmap
  • WiFi — external developer/tester (partner team); provided hands-on feedback and questions
  • Jim — partnerships/community ops; tester coordination, partner relations, recruiting

What Synapse RPC Is and How It’s Built

  • Positioning: Synapse is a Solana RPC and WebSocket gateway designed for low-latency, resilient access to JSON-RPC and derived data (including upcoming gRPC-backed methods). It sits between the client and a pool of upstream nodes (their own validator plus third-party providers).
  • Objective: Deliver fast, reliable reads/writes with intelligent load balancing and caching, plus developer-friendly tooling (SDK, docs, dashboards) and tight integration with UBI’s on-chain agent stack.

Core Components Demonstrated

  • Playground/Test Dashboard (open-sourced post-stream):
    • Visualizes metrics such as total requests, average latency, and cache hit rate (uptime panel present but disabled during testing).
    • Acts as a self-serve sandbox to try features/methods pre-purchase; will require an API key.
  • SDK Modules (in or nearing late-stage development):
    • Smart Cache: multi-layer caching to reduce latency and load.
    • Circuit Breaker: resilience against upstream instability.
    • Load Balancer: intelligent routing across multiple nodes.
    • DeFi components: specialized methods/utilities (in progress).
  • RPC Explorer UI: Try RPC methods with default/custom params and inspect response/latency. On-stream examples for getAccountInfo and getMultipleAccounts showed low-millisecond responses.
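The exact request format used by the Explorer wasn't shown in detail on stream; the sketch below illustrates the kind of call it issues, using the standard Solana JSON-RPC shape for getAccountInfo. The endpoint URL and API-key header are hypothetical placeholders, not confirmed Synapse values.

```typescript
// Minimal sketch of a getAccountInfo call like the one shown in the RPC Explorer.
// SYNAPSE_URL and the x-api-key header are placeholders; the request body itself
// follows the standard Solana JSON-RPC format.
const SYNAPSE_URL = "https://rpc.example.com"; // hypothetical endpoint
const API_KEY = "YOUR_API_KEY";                // issued per account

async function getAccountInfo(pubkey: string) {
  const started = performance.now();
  const res = await fetch(SYNAPSE_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json", "x-api-key": API_KEY },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "getAccountInfo",
      params: [pubkey, { encoding: "base64" }],
    }),
  });
  const json = await res.json();
  console.log(`latency: ${(performance.now() - started).toFixed(1)} ms`);
  return json.result;
}
```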

New Features and Developer Experience

Batch Method Execution (Quota and Latency Savings)

  • Purpose: Combine multiple method calls (e.g., 50 getBalance calls) into a single batched operation.
  • Benefits:
    • Significant quota savings: up to ~60% on Enterprise, ~25% on Advanced, ~10% on Free (illustrative figures shared).
    • Lower end-to-end latency by resolving dependent method outputs within one batch executor when supported.
  • Availability: Documented in the SDK and JSDoc site; certain methods are batch-execution aware.
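The SDK's batch executor API wasn't walked through line by line; as a rough illustration of the underlying idea, the sketch below collapses many getBalance calls into a single JSON-RPC 2.0 batch request (one HTTP round-trip). The endpoint and header are placeholders; how Synapse applies quota discounts on top of this is internal to the service.

```typescript
// Conceptual sketch: send many getBalance calls as one JSON-RPC 2.0 batch
// (an array of request objects in a single HTTP round-trip).
// The endpoint URL and API-key header are hypothetical placeholders.
async function getBalances(endpoint: string, apiKey: string, owners: string[]) {
  const batch = owners.map((owner, i) => ({
    jsonrpc: "2.0",
    id: i,
    method: "getBalance",
    params: [owner],
  }));

  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json", "x-api-key": apiKey },
    body: JSON.stringify(batch), // e.g. 50 calls, one round-trip
  });

  const replies: { id: number; result?: { value: number } }[] = await res.json();
  // Batch responses may arrive in any order, so re-order by id.
  return replies
    .sort((a, b) => a.id - b.id)
    .map((r) => r.result?.value ?? null); // lamports per owner, or null on error
}
```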

WebSocket Support

  • WebSocket tab in the app allows subscribing to programs/updates with very fast push latency.
  • Protocols: WS is supported; WSS is expected to be supported (Steve to confirm with the teammate who implemented the WS server).
  • Status: Available to beta testers; WiFi confirmed he will integrate and report results.
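Subscription payloads weren't shown on stream; the sketch below uses the standard Solana programSubscribe message over a plain WS connection (WSS was still being confirmed). The URL, API-key query parameter, and the choice of the SPL Token program as the subscription target are assumptions for illustration.

```typescript
import WebSocket from "ws";

// Sketch of a program subscription over the WebSocket endpoint.
// The ws:// URL and api-key parameter are placeholders; the subscribe message
// follows the standard Solana programSubscribe format.
const ws = new WebSocket("ws://rpc.example.com/ws?api-key=YOUR_API_KEY");

ws.on("open", () => {
  ws.send(
    JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "programSubscribe",
      params: ["TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA", { encoding: "base64" }],
    })
  );
});

ws.on("message", (data) => {
  const msg = JSON.parse(data.toString());
  if (msg.method === "programNotification") {
    console.log("account update:", msg.params.result.value.pubkey);
  }
});
```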

gRPC/Geyser and “Old Faithful” Integration

  • Geyser plugin integration is underway (expected to land within a day or two of the stream) to support fast, streaming-style access and “reply” methods (examples mentioned: getTransaction-for-address, getSlot).
  • “Old Faithful” plugin: Planned to offload archival-type queries without hosting full archival storage on a single node, distributing historical data queries more efficiently across a cluster.
  • Why others charge $1k+/mo for gRPC: Infrastructure scale and cost — archival/historical storage, cluster management, and maintaining high-throughput servers/VPS.
  • Pricing stance: UBI will avoid competitor-level pricing; gRPC access will be included across tiers in a limited way, with broader access on higher plans.
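To make the archival/historical workload concrete, the sketch below shows the kind of deep-history query the Old Faithful-backed cluster is meant to absorb: paginating all signatures for an address. The method is standard Solana JSON-RPC; the endpoint is a placeholder, and how the gateway routes it to archival-capable nodes is internal to Synapse.

```typescript
// Sketch of a historical query that archival-capable routing is meant to serve:
// walking signatures for an address back through history, 1,000 at a time.
// The endpoint is a placeholder; the method is standard Solana JSON-RPC.
async function allSignatures(endpoint: string, address: string) {
  const signatures: string[] = [];
  let before: string | undefined;

  for (;;) {
    const res = await fetch(endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        jsonrpc: "2.0",
        id: 1,
        method: "getSignaturesForAddress",
        params: [address, { limit: 1000, before }],
      }),
    });
    const { result } = await res.json();
    if (!result?.length) break;
    signatures.push(...result.map((r: { signature: string }) => r.signature));
    before = result[result.length - 1].signature; // page backwards through history
  }
  return signatures;
}
```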

Architecture and Performance

  • Load balancing: Routes calls across a pool of upstream nodes, including UBI-run validator(s) and third-party RPC providers, tuned to minimize contention and hop cost.
  • Dedicated nodes without archival payload: Keeps hot paths fast by avoiding heavy indexing layers for standard methods.
  • Multi-layer caching: Custom memory layers designed to return frequent reads rapidly and reduce round-trips to upstream sources.
  • Upcoming infra updates: Clustered approach with “Old Faithful” to handle archival-like and heavy historical queries more efficiently, and gRPC for reply methods to speed up high-frequency data flows.
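Steve did not show the internals, so the sketch below is only a conceptual illustration of the pattern described above (round-robin routing across upstreams, a per-node circuit breaker, and a small in-memory cache in front), not the Synapse implementation. All names, thresholds, and TTLs are invented for the example.

```typescript
// Conceptual sketch of the routing pattern described above: round-robin load
// balancing across upstream nodes, a simple per-node circuit breaker, and an
// in-memory cache layer in front. Illustrative only; not Synapse's code.
type Upstream = { url: string; failures: number; openUntil: number };

class GatewaySketch {
  private cache = new Map<string, { value: unknown; expires: number }>();
  private next = 0;

  constructor(private upstreams: Upstream[], private ttlMs = 400) {}

  async call(method: string, params: unknown[]): Promise<unknown> {
    const key = method + JSON.stringify(params);
    const hit = this.cache.get(key);
    if (hit && hit.expires > Date.now()) return hit.value; // hot read from cache

    for (let i = 0; i < this.upstreams.length; i++) {
      const node = this.upstreams[(this.next + i) % this.upstreams.length];
      if (node.openUntil > Date.now()) continue; // breaker open: skip this node
      try {
        const res = await fetch(node.url, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }),
        });
        const { result } = await res.json();
        node.failures = 0;
        this.next = (this.next + i + 1) % this.upstreams.length; // rotate
        this.cache.set(key, { value: result, expires: Date.now() + this.ttlMs });
        return result;
      } catch {
        // Trip the breaker after repeated failures so the node is skipped for a while.
        if (++node.failures >= 3) node.openUntil = Date.now() + 10_000;
      }
    }
    throw new Error("all upstream nodes unavailable");
  }
}
```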

Agent Integration (On-Chain Memory)

  • Synapse is natively aligned with UBI’s on-chain agent system:
    • Persist and fetch agent memory (sessions, tool results) on-chain quickly (targeting sub-second fetch) using:
      • PDA Manager (in Protocol SDK)
      • Zero-Combine Fetcher
      • Memory collaboration classes/utilities
  • Value: Agents can rely on immutable data on-chain and remain in-context without expensive rehydration; SDK abstracts the plumbing.
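The PDA Manager and Zero-Combine Fetcher APIs weren't shown in detail; as a rough illustration of the mechanic they wrap, the sketch below derives a PDA for an agent's memory account and reads it with standard @solana/web3.js calls. The seeds, program id, and data layout are hypothetical.

```typescript
import { Connection, PublicKey } from "@solana/web3.js";

// Rough illustration of the underlying mechanic: derive a PDA for an agent's
// memory account and fetch its raw data through the RPC endpoint. The program
// id and seeds are placeholders; the Protocol SDK's PDA Manager / Zero-Combine
// Fetcher abstract this plumbing and deserialize the data.
const PROGRAM_ID = new PublicKey("11111111111111111111111111111111"); // placeholder

async function fetchAgentMemory(endpoint: string, agentId: string) {
  const connection = new Connection(endpoint, "confirmed");

  const [memoryPda] = PublicKey.findProgramAddressSync(
    [Buffer.from("agent-memory"), Buffer.from(agentId)], // hypothetical seeds
    PROGRAM_ID
  );

  const info = await connection.getAccountInfo(memoryPda);
  if (!info) return null;
  return info.data; // raw bytes; the SDK would decode sessions/tool results
}
```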

Documentation

  • Two layers of docs:
    • Getting-started guide for Synapse client.
    • Full JavaScript/JSDoc site (linked from GitHub) describing classes, methods, parameters, return types, usage, and possible errors.
  • Maintenance: Auto-updated on commits/PRs so developers always see current APIs.

Pricing and Tiers

  • Tiers: Free, Basic, Advanced, Enterprise; higher tiers unlock more methods/features and larger quotas.
  • Batch savings differ by tier (noted above). Some advanced features (e.g., broader gRPC method set) will be tier-gated.
  • Strategy: Be more affordable than leading competitors (Helius called out explicitly) while delivering superior performance/feature depth.
  • Timeline: Final, flexible pricing model to be published in ~1–2 weeks from the time of the stream.

Field Feedback and Testing Highlights

  • Degen AI:
    • Stress-tested Synapse trying to force failures and could not “kill” it; requested a batch feature that Steve implemented in about 5 hours.
    • Reported positive results and intends to switch upon full release.
  • WiFi (Nova):
    • Switched a production workload from Helius to UBI’s beta and saw immediate speed/latency gains.
    • A persistent portfolio balance timing issue (zero balance edge case) disappeared with Synapse, indicating prior node latency was the root cause rather than application bugs.
    • Confirmed trading flows worked reliably and felt faster.
  • Additional testers: IQ, Pai, and others onboarded; broader tester pool deliberately extended to gather diverse, high-value insights across different use cases.
  • General approach: “Community-driven RPC” — rapid implementation of tester suggestions, fast iteration cycle, and open feedback loop via forms and live sessions.

Availability and Release Timeline

  • Test dashboard/playground: Repository to be published immediately post-stream under the team’s GitHub org; accessible to anyone with an API key.
  • Beta: Ongoing; intentionally extended to capture broader use-case data, not due to blockers.
  • Public release: Targeting early November (features like improved batching, gRPC/Geyser, and Old Faithful integration rolling out as they mature).
  • Next demo: Planned showcase of Synapse SDK integrated into the broader agent SDK, including a live AI agent demo on Synapse RPC.

Go-To-Market, Brand, and Company Formation

Marketing Strategy

  • Product-first marketing: Perfect the product via beta testers, then ship case studies and proof points.
  • Early adopter incentives: Discounts for projects that switch to Synapse RPC at launch; requests for “powered by Synapse” credits and public testimonials.
  • Channel strategy: Word-of-mouth from respected utility projects on Solana to drive adoption; follow with broader campaigns once case studies are live.

Brand V3

  • Modernization (not a full overhaul like V2):
    • More commercial polish and recognizability while avoiding overly minimal/corporate aesthetics.
    • Animated content suitable for ads (e.g., Google Ads) and standardized post templates on X to distinguish major releases from side updates.
    • Goal: Appeal to both developers and traders while maintaining UBI’s identity.

Company Formation and Team

  • Incorporation: Company formation underway in Italy with accountant and notary; moving quickly to formalize.
  • Recruiting: Hiring two community/ops members; increasing activity on platforms like TikTok; restructuring roles as revenue ramps so contributors are formally paid.

Pricing Philosophy and Accessibility

  • Affordability: Commitment to beat competitor pricing for comparable or better performance; provide essential gRPC access across tiers in a limited fashion; advanced access in higher tiers.
  • Fit-by-need: Cheaper plans for users who don’t need advanced methods; higher tiers for power users and enterprise workloads.

Community Notes

  • Alex urged the community to stay engaged (raids, coordinated participation) and framed the project as undervalued relative to its technical progress; this reflects his personal market viewpoint and motivational messaging to the community.

Key Takeaways

  • Synapse RPC emphasizes speed, resilience, and developer ergonomics:
    • Intelligent load balancing + multi-layer caching
    • Batch execution (substantial quota and latency savings)
    • WebSocket support; gRPC/Geyser and Old Faithful integrations imminent
    • Deep integration with UBI’s on-chain agent stack for sub-second memory fetch/persist
  • Strong early tester feedback from diverse projects; rapid feature turnaround (batching landed in ~5 hours after request).
  • Open-source playground and live docs lower adoption friction; public release targeted for early November.
  • Pricing to be flexible and competitive, with tiered access for advanced features; case studies and incentives planned to drive early conversions.
  • Company formalization in Italy and brand V3 signal a move from “crypto project” to a professional, commercial-ready provider.

Immediate Next Steps

  • Publish the test dashboard/playground repo post-stream; enable self-serve testing with API keys (free tier available, with lower rate limits).
  • Finalize and publish pricing (~1–2 weeks); keep docs auto-updated with new commits.
  • Roll out gRPC/Geyser and Old Faithful support for reply/historical methods; continue infra tuning.
  • Prepare case studies from early adopters; launch marketing pushes with incentives.
  • Schedule the next livestream to demonstrate Synapse SDK + agent integration.