2026-03-16 14:06
ChainThink News, March 16: According to official announcements, Bittensor subnet Templar (SN3) completed on March 10 what it describes as the world's largest decentralized pre-training run of a large language model, Covenant-72B.
Covenant-72B is a 72-billion-parameter language model pre-trained by the Templar team on Bittensor Subnet 3, trained entirely over the public internet without reliance on centralized data centers. The model scored 67.1 on the MMLU (zero-shot) benchmark, outperforming centralized baselines such as LLaMA-2-70B and LLM360 K2 under identical evaluation conditions. It is the largest fully permissionless, collaboratively trained language model to date, with more than 70 distinct nodes contributing compute over the course of training. The team has released all model weights and checkpoints under the Apache License.
Following the announcement, Bittensor (TAO) and its subnet tokens surged: TAO is up 54.8% over the past two weeks, and the subnet token τemplar has risen 194% in the past seven days, now trading at $19.3.
Disclaimer: This article contains third-party opinions and does not constitute financial advice.