For blockchains, transparency is a blessing and a curse. Transparency is one of the primary reasons people trust blockchains. The tradeoff, however, has been a near-total lack of privacy, which is problematic, to say the least. Consider: if the data in every app on your phone were fully public, you probably wouldn’t use half of them, or you would at least think hard before doing so.
In this way, transparency is a constraint against what’s possible onchain. For crypto to fulfill its potential, it’s necessary to create a world where transparency and privacy coexist. The good news is this has been worked on for a decade, and solutions are maturing. Many areas that require privacy—from healthcare to DeSci—are within striking distance.
A few other important use cases:
- Private DeFi: Privacy addresses many problems that current DeFi users face, from targeted phishing attacks, front-running, and MEV to the higher risk of loan liquidation when thresholds are not hidden. Privacy also enables discreet trading venues for whales, known as dark pools, which dampen violent price fluctuations and add protection for users.
- Private Voting: Privacy is a hallmark of democratic processes globally, and it’s equally important for onchain decisions. Privacy is a guardrail against vote buying, coercion, and other forms of bias. And with privacy-preserving tools (like BlockVote) coming to market, it’s possible to protect voter identities and decisions, as well as fine-tune who gets to vote and how votes are weighted.
- Private AI: Privacy is a critical need for AI. This is true in terms of training; some companies are stuck using public datasets due to the inability to train AI models securely using proprietary, high-value data. It’s also true of agents. In a future where personalized agents transact onchain for us, privacy guarantees will be needed to protect the inference, policies, prompts, etc.
Onchain Privacy Solutions
Enabling privacy on a blockchain is not simple, precisely because blockchains are decentralized. In web2, centralized entities and their data centers are trusted to safeguard user data, but this model comes with many pitfalls and feeds pervasive phenomena like surveillance capitalism and other forms of exploitation.
There are various ways of achieving privacy for Web3 users. Asset-specific privacy was one of the earliest methods, pioneered by privacy coins like Zcash and Monero, but more recently this has expanded to include dApps/smart contracts with private state, enabling confidential interactions of greater complexity.
Generally speaking, this evolution has centered around a few key technologies.
Zero-Knowledge Proofs (ZKPs)
In their simplest form, zero-knowledge proofs are a cryptographic method that allows one party to demonstrate the truth of a statement to another party without revealing the underlying data. ZKPs have different use cases within crypto, but one of their main benefits is this ability to keep data private within a blockchain system, made possible by advances in cryptographic building blocks such as hash functions and commitment schemes.
On a ZK-enabled protocol, user transactions can be bundled and confirmed via a proof, all while protecting user balances, account details, and/or other sensitive details.
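To make the idea concrete, here is a minimal sketch of a Schnorr-style proof of knowledge of a discrete log, made non-interactive with the Fiat-Shamir transform. The group parameters (`P`, `G`) are toy-sized assumptions for illustration only; this is not the proof system any particular protocol uses:

```python
import hashlib
import secrets

# Toy parameters -- illustrative only, NOT secure. Real systems use large,
# standardized groups and succinct proof systems (SNARKs/STARKs).
P = 2**127 - 1   # a Mersenne prime, far too small for real use
G = 3            # generator choice assumed for this sketch

def prove(secret_key):
    """Prove knowledge of secret_key where public = G^secret_key mod P."""
    public = pow(G, secret_key, P)
    nonce = secrets.randbelow(P - 1)
    commitment = pow(G, nonce, P)
    # Fiat-Shamir: derive the challenge by hashing the public transcript,
    # so no interaction with the verifier is needed.
    digest = hashlib.sha256(f"{public}:{commitment}".encode()).digest()
    challenge = int.from_bytes(digest, "big") % (P - 1)
    response = (nonce + challenge * secret_key) % (P - 1)
    return public, commitment, response

def verify(public, commitment, response):
    """Check the proof without ever learning secret_key."""
    digest = hashlib.sha256(f"{public}:{commitment}".encode()).digest()
    challenge = int.from_bytes(digest, "big") % (P - 1)
    # Holds iff the prover knew the discrete log of `public`.
    return pow(G, response, P) == (commitment * pow(public, challenge, P)) % P
```

The verifier learns only that the prover knows the secret behind `public` — never the secret itself, which is the property that lets ZK-enabled protocols confirm bundled transactions while hiding balances and account details.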
Multi-Party Computation (MPC)
Often associated with private key custody, multi-party computation is the term for a broad class of protocols that allow multiple parties to jointly compute a function over their inputs while keeping those inputs private. At a high level, here is how it works: a user inputs data, which is split into pieces and masked by adding random numbers. The pieces are then sent to multiple nodes, which jointly perform the computation while no single node ever sees the original information.
By using secret sharing techniques and distributing these pieces across participants, MPC networks eliminate the need to trust a single entity. A key assumption in MPC is that as long as nodes do not collude and do not reveal the secret shares to each other, data stays private and secure.
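The split-mask-distribute flow described above can be sketched with additive secret sharing. This is a minimal Python illustration, not a production MPC protocol; the "nodes" are simulated with plain lists:

```python
import secrets

P = 2**61 - 1  # small prime field for the demo

def share(secret, n=3):
    """Split a secret into n random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two users secret-share their inputs across three nodes. Each node adds the
# shares it holds locally; no node ever sees 20 or 22, yet the sum is correct.
a_shares, b_shares = share(20), share(22)
node_sums = [(a + b) % P for a, b in zip(a_shares, b_shares)]
assert reconstruct(node_sums) == 42
```

Any subset of fewer than all n shares is statistically uniform random noise, which is exactly why the no-collusion assumption matters: only the full set of shares, combined, reveals the value.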
Fully Homomorphic Encryption (FHE)
FHE is a cryptographic technology that enables data processing without decrypting it. It’s a powerful tool that makes it possible to not only store data encrypted on a server but also for the server to compute without having to decrypt it. Under an FHE protocol, a user encrypts their private data and uploads it to a server, which can actually perform a computation directly on user data in ciphertext. The result is then decrypted by the user via their private key.
FHE can be used to create private smart contracts/dApps on public blockchains, where it’s possible to limit who can see transaction data and contract states. As with MPC, private data remains secure provided a threshold of parties stays honest.
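The upload-compute-decrypt flow can be illustrated with the classic Paillier cryptosystem, which is additively homomorphic (not fully homomorphic, so it supports only addition on ciphertexts) and is shown here with toy-sized primes:

```python
import secrets
from math import gcd

# Toy Paillier keypair. Real deployments use primes of 1024+ bits.
p, q = 10007, 10009            # small twin primes, demo only
n, n2 = p * q, (p * q) ** 2
g = n + 1                      # standard simple choice of generator
lam = (p - 1) * (q - 1)
mu = pow(lam, -1, n)           # precomputed decryption helper

def encrypt(m):
    """User-side: encrypt plaintext m with fresh randomness r."""
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """User-side: recover the plaintext with the private key (lam, mu)."""
    x = pow(c, lam, n2)
    return (((x - 1) // n) * mu) % n   # Paillier's L function, scaled by mu

# Server-side: multiplying ciphertexts adds the underlying plaintexts.
# The server computes on the data without ever decrypting it.
c1, c2 = encrypt(17), encrypt(25)
assert decrypt((c1 * c2) % n2) == 42
```

Fully homomorphic schemes extend this idea to arbitrary computation (both addition and multiplication on ciphertexts), at considerably greater computational cost.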
Trusted Execution Environments (TEEs)
Finally, there are trusted execution environments (TEEs), which are hardware-based solutions that establish a secure enclave within an individual machine. These secure digital spaces facilitate cryptographic proofs, which confirm that the hardware and software are correctly configured and shield against unauthorized access or data modification.
This ensures both the privacy of the data and the integrity of the computation performed.
The Oasis Network approach to privacy combines TEEs with the blockchain. In short, you have a secure enclave: take some encrypted data, take some smart contract code, put them together, and you get encrypted results. You can have functions. You can have arbitrary logic. And when the runtime is an EVM, it’s called Sapphire, the first and only confidential EVM blockchain.
Sapphire has a network of nodes that run Intel SGX hardware, and, as a result, all transactions can be completely encrypted/private by default. Because TEEs operate on data directly inside the secure enclave (rather than on secret shares or ciphertexts), they can achieve near-plaintext computation speeds.
Onchain Privacy in an AI-driven World
No conversation about onchain privacy is complete without mentioning AI. As Web3 and AI converge, extending confidentiality (and verifiability) to AI will be paramount. And this is one place where crypto is perfectly positioned to lead. Indeed, decentralized AI offers (among other things) verifiability and confidentiality mechanisms.
Consider an example, putting verifiability aside: if you’re delegating onchain finances or DeFi activities to an agent, it might be storing policies or processing prompts and other sensitive data in plaintext. This makes it highly vulnerable to compromise or manipulation, and that information may be used against you.
A related challenge is private key storage. To transact for you, an agent needs access to a wallet, and in this scenario, private key custody becomes a problem. The solution: the key can exist as a secret inside a smart contract (on a network like Sapphire) and get passed to a TEE so that not even the hardware owner can view the secret. Because the TEE holds private credentials, a human creator never has access to them, ensuring the sanctity of the keys.
This is the tip of the iceberg. And to this end, Oasis built Runtime Offchain Logic, or ROFL, a generalized computing framework that enables arbitrary applications (e.g., AI Agents) to function in a decentralized, verifiable, and confidentiality-preserving way. ROFL safeguards sensitive inputs/outputs, enables key storage for agents, and, in some cases, protects the models themselves. Explore ROFL further here.