Elon Musk introduces Grok, X's answer to ChatGPT
Elon Musk on Saturday announced the launch of Grok, a new generative AI large language model that is said to be modelled after “Hitchhiker's Guide to the Galaxy” and is intended to “answer almost anything and, far harder, even suggest what questions to ask!”
The Tesla and SpaceX CEO appears to be positioning xAI as a challenger to companies like OpenAI, Inflection and Anthropic. Grok is designed to have “a bit of wit” and “a rebellious streak,” and to answer the “spicy questions” that other AI systems might dodge, according to a Saturday statement from xAI.
Grok is a term coined by Robert A. Heinlein in his 1961 science fiction novel “Stranger in a Strange Land.” In the book, 'grok' is a Martian term with no direct Earthling translation.
Grok also has access to data from X, which xAI said will give it a leg up. On Sunday, Musk posted a side-by-side comparison of Grok answering a question versus another AI bot, which he said had less current information. In an initial round of tests based on middle school math problems and Python coding tasks, the company said that Grok surpassed “all other models in its compute class, including ChatGPT-3.5 and Inflection-1,” and was outperformed only by models trained on substantially larger amounts of data.
In the race for AI, whoever has more data wins, which is part of the reason companies such as Facebook and Google have had a head start. X is the latest to join in, and its never-ending stream of posts gives it fresh data on which to train models continuously. In September, Oracle co-founder Larry Ellison said that xAI had signed a contract to train its AI models on Oracle's cloud.
Musk said that Grok will be provided as part of X Premium+, which will set users back $16 per month.
The @xAI Grok AI assistant will be provided as part of 𝕏 Premium+, so I recommend signing up for that. Just $16/month via web. https://t.co/wEEIZNjEkp
— Elon Musk (@elonmusk) November 4, 2023
Still, xAI hedged in its statement that, as with any large language model (LLM), Grok “can still generate false or contradictory information”. Grok-1 was trained on a custom training and inference stack built on Kubernetes, Rust, and JAX, and has real-time access to the latest information from the web.
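For context on that stack: the snippet below is a minimal, hypothetical sketch of what a single training step looks like in JAX in general. It assumes a toy linear model and plain SGD for illustration, and is not based on any published Grok-1 code.

```python
# A toy, generic JAX training step. Purely illustrative: xAI has not published
# Grok-1's code, so nothing here reflects its actual stack.
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Hypothetical linear model: predictions = x @ w + b
    preds = x @ params["w"] + params["b"]
    return jnp.mean((preds - y) ** 2)

@jax.jit  # XLA compilation is one reason JAX is popular for large-scale training
def train_step(params, x, y, lr=1e-2):
    grads = jax.grad(loss_fn)(params, x, y)
    # Plain SGD update; real LLM training shards this across many accelerators
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (4, 1)), "b": jnp.zeros((1,))}
x = jax.random.normal(key, (8, 4))
y = jnp.ones((8, 1))
params = train_step(params, x, y)  # one optimisation step
```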
The beta version of Grok is currently available to a limited number of users in the US. Until it reaches a wider audience that can assess its capabilities in real-world scenarios, we will have to rely on the impressions of the folks X has chosen to grant beta access.