Elon Musk says that Anthropic’s Claude Sonnet model has 1 trillion parameters and Claude Opus has 5 trillion parameters. xAI’s Grok 4.20 has 0.5 trillion parameters, and seven larger models are being trained at xAI:
Grok Imagine V2
2 variants of 1-trillion-parameter models
2 variants of 1.5-trillion-parameter models
1 variant of a 6-trillion-parameter model
1 variant of a 10-trillion-parameter model
0.5T total. Current Grok is half the size of Sonnet and 1/10th the size of Opus.
Very strong model for its size.
— Elon Musk (@elonmusk) April 9, 2026
Claude Mythos probably has 10 trillion parameters.
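The size comparisons in the tweet can be sanity-checked with a quick calculation; the figures below are the speculative parameter counts quoted above, not confirmed numbers from Anthropic or xAI.

```python
# Claimed parameter counts, in trillions (speculative figures from the quoted tweet)
grok_4_20 = 0.5
claude_sonnet = 1.0
claude_opus = 5.0

# Grok is claimed to be half the size of Sonnet and one tenth the size of Opus
sonnet_ratio = grok_4_20 / claude_sonnet  # 0.5
opus_ratio = grok_4_20 / claude_opus      # 0.1

print(f"Grok/Sonnet: {sonnet_ratio}")
print(f"Grok/Opus: {opus_ratio}")
```

Both ratios match the claims in the tweet: 0.5T is half of 1T and one tenth of 5T.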

Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.

