For the past few years I have been working with AI models. I exchange datasets with friends, fine-tune open-source models, and train small ones of my own. Most of the time I store everything on cloud services or platforms like Hugging Face. They are practical, but I frequently worry about what happens if a server goes down, if a company changes its policies, or if I eventually lose access. The data never feels quite like mine.
Lately I have been reading about how blockchain technology can change this. Storing big files, such as datasets or AI models, in a decentralized way that is anchored to a chain seemed genuinely useful: no single business controls the data, and a network of nodes keeps it accessible. Walrus is built for exactly that.
Walrus is a decentralized storage network built on the Sui blockchain. It is designed to handle large, unstructured data: images, videos, datasets, or even complete AI model weights. Walrus calls these "blobs". Rather than keeping a file on a single central server, Walrus splits the data, adds redundancy via erasure coding, and distributes the pieces among many independent nodes. The data can still be recovered even if some nodes malfunction or go offline. This is Byzantine fault tolerance, which simply means the system keeps working reliably even when some nodes fail or misbehave.
I was drawn to Walrus because it fits the demands of AI work. Decentralized training and on-chain AI agents are discussed constantly, but they need a reliable place to store large datasets. Centralized storage carries risks: loss of control, censorship, and downtime. With Walrus, you keep true ownership through Sui objects while the data itself is dispersed and verifiable. Projects like Talus and FLock.io are already using it to store AI-related data on chain.
I decided to give it a shot. The file I wanted to upload was about 2 GB of model weights from a publicly available fine-tuned language model. Nothing secret, just something big enough to feel real. This is a step-by-step account of how it went. If you are as new to AI and crypto as I was when I started, I hope it is helpful.
Getting Ready to Use Walrus
I needed a Sui wallet first. If you don't have one, you can easily set one up with Sui Wallet or any other wallet that supports the network. I already had one because I had tried other things on Sui.
As of early 2026, Walrus runs on Sui's mainnet, so real tokens are used throughout. You need:
A few SUI tokens for gas fees (the price of chain transactions).
WAL tokens to cover storage costs.
I purchased a modest amount of SUI on an exchange and swapped a portion of it for WAL using the standard bridges or DEXes on Sui.
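Before running anything else, I checked that the wallet actually held both tokens. This is just a sketch of how I did it: sui client gas is a standard Sui CLI command, and recent Sui CLI versions also have sui client balance, which lists WAL alongside SUI, though the exact output depends on your version.

# show the SUI gas coins in the active wallet
sui client gas
# show all coin balances for the active address, including WAL (newer Sui CLI versions)
sui client balance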
Then I set up the Walrus CLI tool. For direct uploads it is the most straightforward method, and the official docs walk you through it. After installing the suiup tool, you add the walrus command. It takes just a few minutes, roughly as shown below.
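For reference, this is approximately what the installation looked like on my machine; treat the exact suiup invocation as an assumption and check the official docs if it differs on your platform.

# install the Walrus CLI through suiup (assumes suiup is already installed and on your PATH)
suiup install walrus
# confirm the binary works
walrus --version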
I also looked at the basic web uploader at publish.walrus.site. People use it for single files or quick experiments, and it works well for smaller blobs, but I wanted more control over my 2 GB file, so I went with the CLI.
The Upload Process
The upload was simple after everything was configured.
I launched my terminal and typed the following command:
"walrus store my-dataset-file.gguf --epochs 30".
"my-dataset-file.gguf" was my 2 GB file (a common format for quantized model weights).
"--epochs 30" meant I wanted to store it for about 30 epochs. Each epoch on mainnet is around two weeks, so that gives roughly a year's worth of storage to start with. You can always extend later.
The tool handled the rest. The file was encoded with Walrus's RedStuff scheme, a form of erasure coding that adds redundancy so the content can survive node failures. The encoder produces slivers of the file and sends them to the storage nodes. The price is paid in WAL and is roughly proportional to the encoded size multiplied by the number of epochs.
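If you want a rough cost estimate before committing, the CLI can report the network's current parameters. The command below worked for me; treat the exact output fields, and the encoding-overhead figure in the comment, as approximate.

# print current system parameters: epoch duration, storage price per unit, and so on
walrus info
# back-of-envelope cost: encoded size (roughly 4-5x the raw file after erasure coding)
# times the number of epochs times the per-unit storage price reported above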
Because of the size, the upload took a while, about 20 to 30 minutes on my connection. I watched the progress while I waited; the CLI showed the data being split and transmitted.
When it finished, I got two important things:
A Blob ID – a unique hash that points to the exact content. Anyone with this ID can read the data from any Walrus gateway.
A Sui Blob Object ID – this is an on-chain object that proves certification and lets me control the storage. I own this object in my wallet. Only I can extend the storage period or remove the certification if I want.
That was it. The dataset was now stored decentrally.
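To double-check the result, I queried the blob from the CLI. The subcommands below match the version I installed; if yours differs, the names may vary slightly.

# show certification status and the end epoch for a stored blob
walrus blob-status --blob-id <blob-id>
# list the blob objects owned by the active wallet
walrus list-blobs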
Retrieving and Using the Data
To test, I downloaded it back on another computer:
"walrus read <blob-id> --out recovered-file.gguf"
The file matched perfectly – byte for byte. No corruption.
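The byte-for-byte check was just a checksum comparison on both machines; sha256sum is the standard tool on Linux (on macOS it is shasum -a 256).

# on the machine that uploaded the file
sha256sum my-dataset-file.gguf
# on the machine that ran walrus read
sha256sum recovered-file.gguf
# the two hashes should be identical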
Since the Blob ID is public, I could also share it directly. Gateways like walrus.site let anyone fetch it over HTTP without needing the CLI.
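For an HTTP fetch, any public aggregator works. The host below is a placeholder, and the path format is the one my aggregator used; other aggregators or versions may expose a slightly different route.

# download the blob over HTTP from a public aggregator (replace the placeholder host)
curl -o recovered-file.gguf https://<aggregator-host>/v1/blobs/<blob-id>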
The nice part is the ownership. The Sui object means I have proof that this blob is certified for the storage period I paid for. If I want, I can extend it later by sending more WAL. Or I can build smart contracts that reference this blob for on-chain applications.
Why This Feels Different: No Central Risks, Real Ownership
Conventional cloud storage relies on a single provider. If their servers have problems, your data is unreachable. If they change their policies, you may lose access. I have seen datasets vanish from platforms before.
With Walrus there is no single point of failure. The data lives on hundreds of nodes, and the number grows with the network. The file stays accessible as long as enough nodes remain honest, and thanks to erasure coding it can survive even if a significant fraction of nodes fail.
Ownership is also more tangible. I own the Sui object, so both the storage commitment and the metadata are under my control, and no one can alter the content itself without creating a new blob. For AI work, this verifiability is useful when you want to prove that a model came from a particular dataset. Costs are transparent: you pay per epoch, only for what you use. Long-term it costs more than cheap cloud storage, but you get decentralization and on-chain features in exchange.
Thoughts After My First Try
It was easier than I expected. Coming from AI tools and being new to crypto, I never felt overwhelmed: the documentation was straightforward and the CLI behaved consistently.
This feels like a small but meaningful step toward open AI. Imagine sharing datasets or models so that anyone can access and verify them without relying on a central service, or building agents that read information straight from on-chain blobs. Projects are already doing this, storing things like advertising datasets, health data for AI, and training results securely.
If you are new to the AI-crypto space and want to try storing your own models or data on chain, Walrus is a great place to start. It gives you real control and removes some of the fragility of central servers. Start with a small file and work your way up to something larger, as I did.
I am keeping this dataset stored for now. It feels good knowing it is out there, distributed and owned by me.


