
Unlock AI3.0: Run RAG & Private Compute on Autonomys' Permanent DSN


Today we are pleased to release our proof-of-concept demo showcasing the practical application of Autonomys Network's decentralized triple-stack solution — Permanent Data Storage Network (DSN), Data Access, and Private Compute. The demonstration focuses on an agent running in a Trusted Execution Environment (TEE), leveraging Retrieval-Augmented Generation (RAG) from securely stored data. By providing a secure and efficient framework for privacy-preserving AI deployment, this demo highlights how developers can utilize Autonomys to build scalable, customizable, and human-centric super dApps. As we walk through each section, we will provide not only timestamps for easy video referencing, but also our insights on why each part matters.

Introduction (0:00–0:41)

What You’ll See:
This segment introduces the demo, focusing on an agent running securely using a TEE with NVIDIA H100 GPUs. It exemplifies a confidential RAG workflow that prioritizes data privacy.

Why It Matters:
Secure computation through a TEE establishes trust by ensuring sensitive operations are isolated from external interference, enabling privacy-preserving AI development.

Technology Overview (0:41–1:56)

What You’ll See:
Explore the integration of large language models (LLMs), RAG, and vector databases within the secure confines of a TEE. The section highlights the flexibility of deploying these systems locally, in hybrid configurations, or through end-to-end secure setups.

Why It Matters:
This integration provides the foundation for scalable, adaptable AI systems capable of maintaining privacy and security across diverse deployment scenarios.
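
To make the RAG flow concrete, here is a minimal retrieval sketch in plain Python. The bag-of-words "embedding" below is a toy stand-in for a real sentence-embedding model (such as the Hugging Face models used in the demo); only the shape of the pipeline — embed, rank by cosine similarity, feed the top match to the LLM as context — reflects the actual workflow.

```python
from collections import Counter
import math

def embed(text: str) -> dict[str, float]:
    """Toy embedding: L2-normalized bag-of-words. A real deployment
    would call a Hugging Face sentence-embedding model instead."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(c * c for c in counts.values())) or 1.0
    return {t: c / norm for t, c in counts.items()}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse unit vectors."""
    return sum(w * b.get(t, 0.0) for t, w in a.items())

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank stored documents by similarity to the query embedding."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "TEEs isolate computation from the host operator.",
    "Vector databases store embeddings for similarity search.",
    "Llamas are domesticated South American camelids.",
]
top = retrieve("How does a vector database work?", docs, k=1)
prompt = f"Context: {top[0]}\nQuestion: How does a vector database work?"
```

In the demo, Chroma plays the role of the document store and similarity index, and all of these steps run inside the TEE.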

Core System Configuration (1:57–2:24)

What You’ll See:
The system uses Llama (11 billion parameters), Chroma as the vector database, and open-source embedding models from Hugging Face for robust data processing.

Why It Matters:
Open-source tools combined with advanced models democratize access to AI, fostering innovation while reducing reliance on closed, centralized ecosystems.
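
The components above can be pictured as a single configuration. The field names below are illustrative, not Autonomys' actual config schema; they simply collect the stack the demo names (Llama 11B, Chroma, Hugging Face embeddings, an H100-backed TEE) into one place with a basic sanity check.

```python
# Hypothetical configuration mirroring the stack named in the demo.
# Key names are illustrative, not an actual Autonomys schema.
RAG_CONFIG = {
    "llm": {"model": "llama", "parameters": "11B"},
    "vector_db": {"backend": "chroma"},
    "embeddings": {"provider": "huggingface", "open_source": True},
    "runtime": {"tee": True, "gpu": "NVIDIA H100"},
}

def validate(config: dict) -> bool:
    """Minimal sanity check: every subsystem the pipeline relies on is present."""
    required = {"llm", "vector_db", "embeddings", "runtime"}
    return required.issubset(config)
```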

Attestation and Security Verification (2:25–4:07)

What You’ll See:
This section validates the security of computations through TEE attestation, using verifiable signatures, nonces, and JSON Web Tokens (JWTs) to confirm the integrity of the hardware environment.

Why It Matters:
TEE attestation ensures that computations occur in a private, verified, and tamper-proof environment, a critical component for trust in decentralized AI systems.

Data Encryption and Privacy Workflow (4:08–5:22)

What You’ll See:
Data uploaded to the DSN is encrypted using the TEE’s public key. Decryption occurs within the TEE itself, ensuring that even the system operator cannot access the raw data.

Why It Matters:
This workflow highlights a breakthrough in user-centric privacy, ensuring that sensitive information remains confidential throughout the AI pipeline.
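
The key property — the operator stores only ciphertext, and only the TEE-held key can recover the plaintext — can be sketched with a toy symmetric cipher. This is a placeholder only: the actual workflow encrypts to the TEE's public key with vetted asymmetric cryptography, and the SHA-256 counter keystream below must not be treated as production crypto.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR cipher over a SHA-256 counter keystream. Placeholder for
    the real public-key envelope encryption; encryption and decryption
    are the same operation, as with any XOR keystream."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

tee_key = secrets.token_bytes(32)        # held inside the enclave, never exported
document = b"confidential thesis text"
ciphertext = keystream_xor(tee_key, document)  # what the DSN/operator stores
```

The DSN and the system operator only ever handle `ciphertext`; decryption happens inside the TEE, which is the guarantee the section above describes.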

Practical Demonstration: File Upload and Querying (5:23–8:07)

What You’ll See:
This section showcases the upload of a thesis file to the DSN, processed by the TEE, enabling the LLM to answer context-aware queries based on the document.

Why It Matters:
This use case illustrates real-world applications of AI-powered knowledge retrieval while maintaining the highest standards of data security and user privacy.
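
Before a document like the thesis can be embedded and queried, it is typically split into overlapping chunks so each piece fits the embedding model's input window and retains local context across boundaries. The demo does not specify its chunking parameters; the character-based windows below are an illustrative default (production systems often chunk by tokens instead).

```python
def chunk(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split a document into overlapping character windows.
    `size` and `overlap` are illustrative defaults, not the demo's values."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]
```

Each chunk is then embedded and stored in the vector database, so a query retrieves only the most relevant passages of the thesis as context for the LLM.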

Web Search and Vector Database Integration (8:08–9:23)

What You’ll See:
Integrated web search results are processed into the vector database, enhancing the contextual depth of the AI system’s understanding and responses.

Why It Matters:
This capability demonstrates how decentralized AI systems can incorporate external data sources for improved functionality without compromising security.
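
One practical detail when folding web results into a vector store is keying entries by URL, so re-ingesting the same page updates its entry instead of duplicating it. The sketch below assumes a hypothetical result shape (`url`/`text` fields); in the demo the store is Chroma and each text would also be embedded before insertion.

```python
def upsert(store: dict[str, str], results: list[dict]) -> dict[str, str]:
    """Merge web-search results into a store keyed by URL. The `url`/`text`
    field names are assumed for illustration; a real pipeline would also
    embed each text and write the vector to the database."""
    for result in results:
        store[result["url"]] = result["text"]
    return store

store: dict[str, str] = {}
upsert(store, [{"url": "https://example.com/a", "text": "old snippet"}])
upsert(store, [
    {"url": "https://example.com/a", "text": "updated snippet"},
    {"url": "https://example.com/b", "text": "new page"},
])
```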

Backup and Encryption Features (9:24–9:57)

What You’ll See:
The vector database’s backup options include encryption and password protection, further securing stored information.

Why It Matters:
Secure backup mechanisms protect against data loss while ensuring continued compliance with privacy standards.
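
Password-protected backups generally derive the encryption key from the password with a deliberately slow key-derivation function, so that a stolen backup cannot be brute-forced cheaply. The demo does not disclose its exact scheme; the sketch below shows the standard PBKDF2-HMAC-SHA256 pattern with an illustrative iteration count.

```python
import hashlib
import secrets

def derive_backup_key(password: str, salt: bytes = None) -> tuple[bytes, bytes]:
    """Derive a backup encryption key from a password via PBKDF2-HMAC-SHA256.
    The 200,000-iteration count is illustrative, not the demo's setting."""
    salt = salt or secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return key, salt

key, salt = derive_backup_key("correct horse battery staple")
```

The random salt is stored alongside the backup; the same password plus the same salt always reproduces the same key, while a wrong password yields a different one.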

Conclusion

Thank you for exploring our proof-of-concept demo showcasing the capabilities of Autonomys’ Permanent DSN, Data Access, and Private Compute in delivering secure, efficient, and decentralized AI solutions. It is a practical exploration of how privacy, scalability, and accessibility can coexist to support human-centric decentralized AI (deAI) and super dApps.

Are you a founder or builder interested in learning more about how Autonomys Network can support your decentralized solution?