The Web3 world is full of storytelling projects. Open any crypto community and you will see grand narratives everywhere: metaverse, decentralized social, on-chain identity. These concepts sound beautiful, but most projects' business models still follow the old path of issue tokens, speculate, cash out. Investors are tired of hollow white papers and are starting to ask a fundamental question: can this project actually make money? KGeN's answer is straightforward: annual recurring revenue has exceeded $80 million. This figure is not a forecast or a beautiful vision on a roadmap, but a reality as of January 2026. From $48.3 million in August 2025 to $70 million by year-end, and then past $80 million in early 2026, KGeN's ARR curve has maintained a steep upward trend. In an industry driven primarily by speculation, this growth stands out.
Many people ask me whether I have sold Aster. The current price has not reached my valuation, so I am still holding.

Reasons for keeping Aster:

1. Different market environments. October 2024 (HYPE): early bull market, strong demand for contracts. September 2025 (Aster): a relatively rational market, where establishing trust in spot trading matters more.

2. Different competitive landscapes. HYPE's advantage: competition among contract DEXs was low at the time, and its technology led the field. Aster's challenge: facing a mature, strong competitor in Hyperliquid, it needs to build a user base first.

3. Token distribution strategy. Aster airdropped 8.8% of supply, preventing large-scale sell-offs, and withdrawal locks keep early liquidity controllable.

Strategy assessment: Aster is not a simple replication of HYPE but a reverse strategy built for a different market environment. Same goals: control liquidity, win price-discovery dominance, build a platform moat. Different paths: spot first versus contracts first. Adaptability: a rational choice given the current market and competitive landscape.

Next-step predictions:

Short term (2-4 weeks): more second-tier CEX spot listings go live; the APX token swap completes on first-tier exchanges such as Binance; liquidity gradually improves but remains relatively thin.

Medium term (1-3 months): derivatives trading launches, prioritized on the Aster platform; mainstream CEXs start to notice demand for ASTER contracts; healthy competition with HYPE takes shape.

Long-term risks: if the spot phase cannot establish a sufficient user base, the later derivatives push will struggle; dispersed liquidity may hurt the trading experience and prove less effective than HYPE's concentrated strategy.

Aster has chosen a more conservative strategy that may be better suited to the current environment. I hope it succeeds! #空投大毛
Programmable Storage: The Development-Paradigm Shift When Walrus Meets Sui
As a developer who works with storage constantly, I was genuinely amazed the first time I used Walrus's CLI tool to upload a 2 GB model file. It wasn't the speed or the cost; it was how seamless the whole flow felt: only five commands from initial configuration to holding the storage proof. That kind of developer experience is almost unprecedented in decentralized storage. Walrus's deep integration with Sui introduces an interesting concept: programmable storage. Storage resources are no longer passive data warehouses; they become intelligent objects that can be actively managed and composed. For example, you can write a smart contract on Sui that automatically renews storage time, or one that triggers a storage migration when a certain NFT is transferred. Traditional storage solutions cannot offer this flexibility.
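As a rough, language-neutral sketch of that idea (Python standing in for Sui Move; the names StorageObject, renew, and on_nft_transfer are hypothetical, not Walrus's actual API), storage-as-a-programmable-object might look like this:

```python
from dataclasses import dataclass

# Hypothetical model of "programmable storage": a storage resource is an
# object whose lifecycle rules (renewal, ownership) live in contract code.
@dataclass
class StorageObject:
    blob_id: str
    expires_at_epoch: int
    owner: str

    def renew(self, epochs: int, payment: int, price_per_epoch: int) -> None:
        """Extend the storage period if the payment covers it --
        the kind of auto-renew logic a contract could run on a schedule."""
        if payment < epochs * price_per_epoch:
            raise ValueError("insufficient payment")
        self.expires_at_epoch += epochs

    def on_nft_transfer(self, new_owner: str) -> None:
        """Hook a contract could fire when the NFT referencing this blob
        changes hands: ownership of the storage follows the asset."""
        self.owner = new_owner
```

The point is not the Python itself but the composition: because the storage object lives on-chain, other contracts can hold it, pay for it, and react to events around it.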
Before going to bed, I received rewards from the Square as well. Binance is remarkably efficient: it listens to user feedback and acts fast. Creator task leaderboard rewards will now be distributed every 14 days after a project goes live, with the total prize pool split evenly across distributions according to the activity's duration. I believe this new mechanism will bring creators more frequent recognition and incentives.
The Battle Between Blockchain Compliance and Privacy: How Dusk Dominates the RWA Track
While traditional financial giants are still wrestling with on-chain compliance, Dusk has delivered a thorough answer with a sophisticated multi-layer modular architecture. This is not pseudo-innovation that changes only the surface; it genuinely reconstructs, from the ground up, the logic of how a blockchain serves financial markets. Their recently launched three-layer architecture of DuskDS, DuskEVM, and DuskVM lands like a depth charge on the industry, making competitors still running single-chain designs look outdated. Start with the architecture's core idea. DuskDS, the base layer responsible for consensus and data availability, integrates EIP-4844 Proto-Danksharding. Most critically, it implements a pre-validation mechanism that eliminates the awkward 7-day dispute window that Optimism-style rollups need. What does this mean? Institutional users no longer have to worry about transaction finality: settlement completes instantly, a genuine breakthrough for high-frequency trading scenarios.
The Conclusion of the Stablecoin War: Why Plasma is Eating Tron's Lunch
When I saw Plasma's zero-fee architecture, it felt like this thing was challenging the laws of physics. How could a blockchain not have Gas fees? But upon deeper research, I found that this is not magic; it is mathematics—an extreme optimization for stablecoin scenarios. Currently, transferring USDT on the Tron network costs over 7 dollars, while Plasma brings it down to zero. This difference is no longer a quantitative change but a qualitative one. We need to first understand why Tron can become the king of stablecoins. Essentially, it benefited from the era's dividends, entering emerging markets early with the support of Binance, forming a huge liquidity network. However, the problem is that Tron's architecture is generic. It has to handle various NFT, DeFi, and gambling applications. Stablecoin transfers are just one use case. It's like using a Swiss army knife to cut steak: it can work, but it's inefficient.
The Breakthrough of Tokenized Assets—How Dusk Transforms RWA from Concept to Reality
The term RWA has been hotly debated over the past year, with project after project claiming to work on asset tokenization. Look closer, though, and most are still at the pitch-deck stage; very few have actually shipped. The core issue is that tokenizing real-world assets is not purely a technical problem: it involves a complex chain of legal compliance, asset custody, and value anchoring, and many technically advanced public-chain projects stumble there. Dusk's approach is unusual: build compliant infrastructure by starting with licensed institutions, then empower it with technology, rather than building technology first and hunting for a use case.
When Privacy Meets Compliance—How Dusk Solves the Biggest Challenge of Traditional Finance Going On-Chain
To be honest, when I first encountered Dusk I was confused. There are already so many public blockchains; why build one specifically for privacy-preserving compliance? After digging deeper, I found the project targets the pain point with remarkable accuracy. The core contradiction for traditional financial institutions that want to go on-chain but dare not is exactly this: protecting customer privacy while meeting regulatory requirements. These two seemingly contradictory demands are nearly unsolvable on existing chains.

The solutions on the market sit at two extremes. On one side are fully transparent public blockchains like Ethereum, where every transaction record is publicly accessible; for financial institutions that is a disaster. Imagine your JPMorgan account balance and every transfer being visible to the entire network. Who could tolerate that? On the other side are fully anonymous designs like Monero. Privacy protection is in place, but regulators simply do not accept it: an unauditable system cannot operate in compliance under the current legal framework. The EU's MiFID II and MiCA explicitly require that financial transactions be traceable and auditable, which is why many exchanges have gradually delisted privacy coins.
Every new protocol gets hyped at launch, and Walrus is no exception. Decentralized storage, an AI data market, 5x cost efficiency: the labels sound beautiful, but in practice there are real problems. As someone involved with Walrus since mid-2025, I think it is worth writing these pitfalls down so later participants don't step on the same landmines. Criticism aside, the project still has value; otherwise I would not keep following it.

The most glaring issue is the security audit. Certik's Skynet gave Walrus a security score of 30, which is frightening. Specific findings include missing security headers such as X-Frame-Options, Strict-Transport-Security (HSTS), and X-Content-Type-Options; no Content Security Policy (CSP); HTTP access still allowed; and even a self-signed certificate whose host does not match. These are web-security fundamentals, and for a project aiming to do decentralized storage, slipping on such details is hard to accept. Hackenproof does list smart-contract audit scopes for Walrus, but as of January 2026 I have not seen a detailed audit report made public. That lack of transparency may make some major clients hesitate.
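Those missing headers are easy to verify independently. Here is a minimal sketch using only the Python standard library; the header list mirrors the findings above, and the URL passed to `audit` is whatever endpoint you want to check:

```python
from urllib.request import urlopen

# Security headers a hardened web frontend is expected to send
# (the ones the Skynet findings above call out as missing).
EXPECTED = [
    "Strict-Transport-Security",   # HSTS
    "X-Frame-Options",
    "X-Content-Type-Options",
    "Content-Security-Policy",
]

def missing_headers(headers: dict) -> list:
    """Return the expected security headers absent from a response."""
    present = {name.lower() for name in headers}
    return [h for h in EXPECTED if h.lower() not in present]

def audit(url: str) -> list:
    """Fetch a URL and report which security headers it fails to send."""
    with urlopen(url) as resp:
        return missing_headers(dict(resp.headers))
```

Running `audit` against a site that sends none of these headers returns all four names, which is essentially what a low Skynet web score reflects.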
General Liang Xi's explosive popularity rests on cross-field talent and the extreme swings of the cryptocurrency market, which made him a widely discussed "legendary trader." His record is marked by extreme contrasts. In sixth grade he reached Master rank in League of Legends, showing his gaming talent; after dropping out of middle school he switched to Honor of Kings, finishing top three in peak competitions and ranking first on the national server as Liu Bei, a notable figure in the gaming community. More remarkably, he earned national level-one swimmer status at age 10 and placed top three in a national swimming competition.

What truly made him go viral were his extreme trading runs in the crypto market. 2021 was his moment of glory: entering with only 1,000-3,000 yuan of capital, he caught the 519 market surge and, through precise trades, made around 40 million yuan in a single month, jumping from small-time trader to top player in the crypto scene. The fall was just as sharp: during a bull-market rebound, a misjudged high-leverage long position led to liquidation, wiping out his earnings and leaving a rumored debt of around 200 million yuan; he reportedly came to the brink of suicide by pesticide, though fortunately did not go through with it. Despite hitting bottom repeatedly, he never left the market. From 2024 to 2025 he staged a strong comeback, repeatedly turning small capital into outsized profits, once starting with 2,000 U and making millions within hours.

In 2025 he even recorded over 10 million U in earnings in a single night, gradually repaying his massive debts with these profits. This extreme history is why netizens call him "the legend trapped in leverage": he has been zeroed out multiple times in crypto yet always flipped the situation again through exceptional trading skill. That trajectory of extreme highs and lows made him one of the most controversial and followed figures in the space. Then his account was deleted.
Ondo Finance unlocks $772 million of ONDO; the RWA leader's valuation comes under pressure

Ondo Finance, focused on tokenizing real-world assets, today unlocked 1.94 billion ONDO tokens worth $772.4 million, accounting for 57.23% of the released supply. It is one of the largest single unlock events this year, creating huge short-term supply pressure on the ONDO price and testing the market's absorption capacity.

Ondo is a star project in the RWA sector; its tokenized U.S. Treasury product OUSG manages over $500 million, providing on-chain fixed-income products to institutions and individuals. Even after traditional giants like BlackRock entered the space, Ondo has kept its edge in technology and product. But this unlock is simply enormous, almost equivalent to doubling the circulating supply.

From the unlock structure, this batch mainly covers protocol development, ecosystem expansion, and private-investor allocations. The private-investor portion is the biggest worry: these early backers may have cost bases at a fraction of the current price, so even selling at half price would still be hugely profitable. Historical data show most projects pull back 30-50% after unlocks of similar scale.

Ondo's fundamentals remain solid, though. RWA is a sector with genuinely validated demand, especially in a high-interest-rate environment where on-chain Treasury yields have attracted substantial capital. Ondo's TVL has grown 300% over the past year and protocol revenue is rising steadily, an ability to earn that many DeFi projects lack.
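A quick back-of-envelope check on the figures above, taking the post's numbers at face value (note the 57.23% figure implies roughly a 1.57x supply expansion rather than a strict doubling, depending on what base "released supply" refers to):

```python
# Figures quoted above, taken at face value
unlock_tokens = 1.94e9          # ONDO tokens unlocked
unlock_value_usd = 772.4e6      # stated dollar value of the unlock

# Implied price at which the unlock was valued: ~$0.398 per ONDO
implied_price = unlock_value_usd / unlock_tokens

# If the unlock equals 57.23% of the already-circulating supply,
# circulation grows by that same fraction once the tokens hit the market
supply_growth = 0.5723
new_supply_multiple = 1 + supply_growth   # ~1.57x the prior supply
```

Whether that 57% expansion gets absorbed at ~$0.40 is exactly the test the post describes.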
Separating settlement and execution sounds beautiful, but in practice, many unexpected issues will arise. DuskDS is responsible for consensus and data availability, while DuskEVM and DuskVM run applications on top of it. This separation of responsibilities indeed allows each layer to focus on optimization, but the complexity of inter-layer communication grows exponentially. The delays in message passing, guarantees of state consistency, and cascading error handling can all cause problems at every stage. Although Proto-Danksharding from EIP-4844 compresses data, the pricing and availability guarantees of blob data are still evolving. Dusk relies on this feature to control costs. If the specifications on the Ethereum side change, Dusk must adjust accordingly. Such external dependencies are the cost of modularization. A deeper issue is the leakage of abstraction. Theoretically, each layer should be a black box, but in actual development, the limitations of the underlying layer can penetrate to the upper layer. For example, the 2-second block time of DuskDS determines the lower limit of final confirmation delay for DuskEVM. If an application requires faster confirmation, it must change the underlying consensus, which will affect all other applications. The flexibility of modularization becomes a shackle in this situation. The advantage of a monolithic chain is that everything is under control, with clear optimization paths. The advantage of modularization is clear division of labor, making upgrades easier. Dusk chooses the latter because compliance scenarios require clear hierarchical separation. Regulatory bodies only need to review the settlement layer, without concerning themselves with execution details. This clarity in architecture is crucial for permissioned markets. However, this also means that Dusk has to bear a higher engineering complexity. Testing, deploying, and monitoring multi-layer systems are much more difficult than monolithic ones. 
Whether the team size can support the long-term maintenance of this architecture is a question mark. The fact that protocols like Sozu can run indicates that the ecosystem is developing, but the depth is not enough. More developers need to participate to validate the feasibility of the architecture. There is no right or wrong in the technical route, only whether it is suitable for specific scenarios. Dusk bets on the explosion of the regulatory market. If this market takes off, the advantages of modularization will be amplified. If the market does not thrive, the complex architecture may instead become a burden. $DUSK #Dusk @Dusk
At the end of last year, the Haulout hackathon drew over 800 participants, who ultimately submitted more than 280 projects. Together these projects sketch the application map for Walrus's future, and several winning entries deserve attention.

The data-market track had the most entries, which is no surprise: Walrus is naturally suited to trading and sharing large files. One team built an open-source platform for physical-AI models, where research institutions upload trained models and other developers pay to download and use them. The model files are stored on Walrus and encrypted with Seal; payment automatically grants decryption rights, and the whole flow settles on-chain, transparent and auditable. This model is hard to achieve on traditional platforms, which cannot stop a buyer from re-sharing a download; blockchain access control solves that.

The AI x Data track is also interesting. The winning project, Hyvve, focuses on a marketplace for AI training datasets. Data providers upload datasets to Walrus and set prices and access rules, and AI developers purchase them for model training. The key is that both the source and the quality of the data carry on-chain proof, which keeps inferior data from flooding the market, a real step for data transparency in the AI industry.

There is also a verifiable code-hosting project, similar to GitHub but fully decentralized. Repositories are stored on Walrus, every commit has an immutable on-chain record, and combined with Sui's code-verification tools it can prove that a specific version of the code was indeed submitted at a certain time. This has immense value for open-source collaboration and code auditing. These projects are still at the demo stage, but the direction is clear.
Walrus's killer application is not to replace personal cloud storage like Dropbox, but to empower scenarios that require data sovereignty, verifiability, and resistance to censorship—such as AI, scientific research, creator economy, and open-source collaboration. The demand for decentralized storage in these fields is much stronger than that of ordinary users. @Walrus 🦭/acc $WAL #Walrus
When Erasure Coding Meets the Sui Chain: How Walrus Redefines Decentralized Storage
Honestly, the storage track is a bit of an arms race among blockchain infrastructure projects. Filecoin has been preaching decentralized cloud storage for years, yet actual usage remains painful: retrieval is frustratingly slow, and costs are not as rosy as imagined. Arweave took a different path, promoting permanent storage, which sounds cool, but its one-time payment model puts real pressure on many projects. Against this backdrop, Walrus's emergence is interesting. It is built on the high-performance public chain Sui and uses an erasure-coding scheme called Red Stuff; the name alone tells you it is not a traditional solution.
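Red Stuff itself is a two-dimensional encoding scheme and far more involved, but the core erasure-coding idea, recovering lost shards from redundancy rather than full replicas, can be illustrated with the simplest possible code: a single XOR parity shard (a toy sketch, not Walrus's actual algorithm):

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard."""
    shard_len = -(-len(data) // k)                       # ceiling division
    padded = data.ljust(shard_len * k, b"\0")            # pad to k shards
    shards = [padded[i * shard_len:(i + 1) * shard_len] for i in range(k)]
    parity = reduce(xor_bytes, shards)
    return shards, parity

def recover(shards, parity, lost: int) -> bytes:
    """Rebuild the shard at index `lost` from the survivors plus parity."""
    survivors = [s for i, s in enumerate(shards) if i != lost]
    return reduce(xor_bytes, survivors, parity)
```

Any single lost shard XORs back out of the parity. Production schemes like Reed-Solomon (and Red Stuff's 2D variant) generalize this to tolerate many simultaneous failures at a fraction of the overhead of full replication, which is where the cost advantage over naive copy-everywhere storage comes from.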
The storage needs of the podcast industry differ sharply from those of general applications. Audio files easily reach several hundred MB. The traditional approach is to throw them onto SoundCloud or build a private CDN, but both carry censorship risk and single points of failure. Unchained, a well-known podcast in the crypto field, chose Walrus to store its media library, and the decision is worth analyzing.

The core pain points for a podcast are permanence and censorship resistance. After years of producing a show, one platform policy change can take everything down. Walrus's decentralized storage ensures that as long as the network exists, the content stays accessible; no one can unilaterally delete your audio, which for independent media is a necessity.

Cost is the other consideration. Unchained may have several hours of audio per episode, plus multilingual versions and edited clips, so storage adds up. Traditional CDNs charge by traffic, and when a popular show gets heavy play the bills skyrocket. Walrus charges by storage duration: a one-time payment keeps data on the network for 2 years, and renewals are cheap. That cost predictability is very friendly to content creators.

Technically, Walrus's edge caching lets popular shows load at speeds comparable to centralized CDNs. Paired with playback records and subscription management on the Sui chain, the entire podcast system can run fully decentralized: subscriptions, listening history, and comments all live on-chain, with no worry about a platform going bankrupt or misusing data. At this stage Walrus's ecosystem tooling is still thin, and the Unchained team may need to write some integration code themselves, but that early-adopter investment is worthwhile.
Once the decentralized podcast infrastructure matures, the entire power structure of the industry will change. Creators will no longer be subject to platforms, and listeners will truly own their content libraries. @Walrus 🦭/acc $WAL #Walrus
In cloud storage, AWS S3 is the de facto standard: millisecond latency, 99.99% availability, flexible pay-as-you-go pricing. For the decentralized alternative Walrus to capture any market, it must answer one crucial question: how much performance is sacrificed?

Data doesn't lie. S3's hot-storage latency really is in the millisecond range, but that is under ideal single-region conditions. Once cross-region access or regional failures enter the picture, S3's problems show. Last year an outage in one AWS region took down numerous applications, while applications running on Walrus, with its globally distributed nodes, were unaffected. That resilience is something centralized architectures cannot provide.

On latency, Walrus's hot reads come in under 2 seconds. That looks slower than S3, but it is achieved under Byzantine fault tolerance and data validation: every read can prove the data has not been tampered with, whereas S3 offers only a promise of trust, with no cryptographic proof. For scenarios requiring auditing and compliance, the verifiability packed into those 2 seconds is worth far more than the surface numbers.

The cost comparison is even more interesting. S3 standard storage runs about $0.023 per GB per month, roughly $0.276 per GB per year; Walrus is about $0.02 per GB per year, more than 10 times cheaper. Of course, S3's advantage is its mature ecosystem, with tools and services ready out of the box, while Walrus is still early and its developer experience and surrounding tooling cannot yet compete.
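The cost claim above is simple to check with the figures as quoted (real S3 bills also include request and egress fees, which this ignores):

```python
# Quoted per-GB prices
s3_per_gb_month = 0.023          # S3 standard storage, $/GB/month
walrus_per_gb_year = 0.02        # Walrus figure quoted above, $/GB/year

s3_per_gb_year = s3_per_gb_month * 12          # ~$0.276 per GB per year
ratio = s3_per_gb_year / walrus_per_gb_year    # ~13.8x cheaper
```

The ~13.8x ratio is where the "more than 10 times cheaper" claim comes from; add S3's egress charges for read-heavy workloads and the gap widens further.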
However, if we consider it from a different perspective, if your application has requirements for audit resistance, data sovereignty, and transparency—such as storing sensitive documents, AI training datasets, or NFT metadata—then the performance gap with Walrus is entirely acceptable. Moreover, as network expansion and caching optimizations continue, this gap is narrowing. Decentralization is not about being slow; it is about eliminating single points of risk while ensuring performance. @Walrus 🦭/acc $WAL #Walrus
Testnet ID 745 has been open for almost a month. Judging from the deployment documentation on GitHub and community feedback, stability has basically met the bar, but production readiness is still some distance away.

First is the gas pricing model. The computational cost of ZK proofs is on a completely different scale from ordinary EVM operations. The current gas table is still being tuned, and fees for complex privacy transactions fluctuate widely; a transfer involving homomorphic encryption can sometimes consume more than ten times the gas of a regular transfer. This directly affects user experience and must stabilize before mainnet launch.

Next is toolchain compatibility. Hardhat and MetaMask are officially supported, but real use surfaces edge cases: for example, event-log formats differ for privacy transactions, forcing the front end to do extra parsing. Small issues like these accumulate and slow developer migration. EVM compatibility is not a binary property; differences in the details get magnified.

Then there is state-synchronization delay. DuskEVM must stay consistent with DuskDS, and cross-layer communication delays get amplified under network congestion. The testnet occasionally shows state confirmation slower than expected; on mainnet, with real funds, such delays could trigger arbitrage or liquidation risk.

The biggest concern is security-audit progress. The OP Stack itself is battle-tested code, but Dusk's modifications, especially the privacy-layer integration, need independent audits. No public audit report is available yet, which is a real risk for a chain expected to carry institutional funds. The postponement of the mainnet also shows the team is being cautious about security, but transparency could be higher.
The significance of the testnet is to expose issues. Discovering these now is better than encountering problems after the mainnet goes live, but the time window for fixes is narrowing. $DUSK #Dusk @Dusk
Two-second block times sound unremarkable; what matters is deterministic finality. Many chains achieve fast block production but cannot provide instant finality. Succinct Attestation uses a committee voting model in which stakers holding a predefined amount of DUSK can participate. After a candidate block is generated, it goes through two rounds of voting: the first round is pre-confirmation, the second is final confirmation. Once a block enters the ratification stage it is irreversible. Users will never see a reorg, and that certainty is crucial for financial settlement.

The TARGET2 real-time settlement system requires settlement verification to complete within 2 to 15 seconds. Dusk's 2-second finality fits the lower end of that window, meaning securities settlement can execute in real time on-chain without external custody or delayed batch processing. In traditional financial infrastructure, real-time gross settlement (RTGS) is the gold standard, and here it has been migrated on-chain.

This performance has a cost: the committee model implies a degree of centralization. Committee members rotate, but the validator set at any moment is limited, which weakens decentralization compared with fully open PoW or large-scale PoS. For permissioned, institution-facing scenarios, though, the trade-off is acceptable; regulators actually prefer networks whose validators can be identified, since there is a clear responsible party when issues arise.

Another open question is how Succinct Attestation behaves under extreme network partitions. The documentation cites high throughput and low latency but does not detail liveness guarantees under adverse network conditions, and financial markets occasionally see congestion during liquidity crises.
Whether the consensus mechanism can maintain stability in such situations still needs real-world testing; theoretical models and stress testing are two different things. $DUSK #Dusk @Dusk
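As a rough illustration of the two-round committee vote described above (not Dusk's actual implementation; the 2/3 quorum is an assumed placeholder for a supermajority threshold):

```python
QUORUM = 2 / 3  # supermajority threshold, assumed for illustration

def round_passes(votes) -> bool:
    """A voting round passes when more than 2/3 of ballots approve."""
    return sum(votes) > QUORUM * len(votes)

def finalize(block, committee, validate) -> bool:
    """Two sequential rounds: pre-confirmation, then ratification.
    A block that clears both rounds is final -- no reorg is possible."""
    for _round in ("pre-confirmation", "ratification"):
        votes = [validate(member, block) for member in committee]
        if not round_passes(votes):
            return False
    return True
```

The design point is that finality is a yes/no outcome of the vote itself, not a probability that decays with confirmations as in Nakamoto-style consensus.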
Ethereum often hits bottlenecks on RWA: on-chain data is messy, and when handling real estate or invoices, agents must repeatedly pull off-chain information, driving up latency and cost. Vanar Chain's complete AI stack tackles this directly with a five-layer design running from the foundational layer up to the Flows layer. The base layer provides high throughput and EVM compatibility so developers can integrate seamlessly; Neutron compresses data into Seeds, Kayon runs inference and generates compliance proofs, and Axon automates execution, all on-chain, removing reliance on oracles in practice. Vanar shines when tokenizing real assets, for example turning invoices into queryable objects that agents can audit automatically. Solana, by comparison, is fast but lacks structured memory; as data grows, agents get confused. Vanar's UDF storage keeps everything organized, with costs as low as a few cents per transaction. Developer feedback says that when building a PayFi dApp, Vanar's SDK makes AI integration as simple as snapping blocks together. Polygon is scalable, but its AI capability is weak, requiring an extra middleware layer that enlarges the attack surface. Vanar offers end-to-end intelligence: agents learn from historical data and predict risk, for example automatically verifying cross-chain RWA transfers to avoid human error and address the trust problems of traditional chains. After an application launches, transaction volume climbs steadily because users find it reliable and engaging. Vanar is not just a stack of technology; it takes RWA from concept to everyday use, with agents actively optimizing processes and making Web3 more grounded. Consider the potential: Vanar's stack will shine in the AI era, not as passive storage but as proactive intelligence, leading competitors by a clear margin. @Vanarchain $VANRY #Vanar