Elon Musk’s “Grokipedia” Is, of Course, Not Wikipedia

Wikipedia is a valuable online resource that, despite the internet’s massive changes, has managed to remain excellent to this day. Like millions of other users, I visit the site daily to learn something new or double-check what I already know. In an age of pervasive AI, Wikipedia is a kind of antidote.
If you look at Wikipedia and think, “It’s fine, but an AI version would be so much better,” you might be Elon Musk. Musk’s AI company, xAI, just launched Grokipedia (yes, that’s a real thing)—an online encyclopedia very similar to Wikipedia in name and appearance. But underneath, the two encyclopedias couldn’t be more different. While this new “encyclopedia” is still in its early stages, I’d say it’s not worth using, at least not for anything serious.
Grokipedia Experience
When you load the Grokipedia website, it looks pretty standard. You see the Grokipedia name, the version number (at the time of writing, v0.1), a search bar, and an “Available Articles” counter (885,279). Finding an article is also simple: you enter a query, and a list of available articles appears from which you can select the one you need. When you open an article, it looks like Wikipedia, albeit in a very basic way: no images, just text, though you can use the sidebar to navigate between article sections. You’ll also find references, identified by numbers, which correspond to the “References” section at the end of each article.
However, a key difference between Grokipedia and a simplified version of Wikipedia is that these articles are not written or edited by real people. Instead, each article is created and “verified” by Grok, xAI’s large language model (LLM). LLMs can generate large volumes of text in short periods of time and can cite the sources from which they obtain their information, which may make the idea of Grokipedia appealing to some. However, LLMs are also prone to hallucinations—in other words, to fabrication. Sometimes the sources from which the AI obtains information are unreliable or satirical; other times, the AI simply generates text that doesn’t correspond to reality. In either case, the information cannot be trusted, so it’s alarming that much of the experience is powered entirely by Grok, without human intervention.
Grokipedia vs. Wikipedia
Musk positions Grokipedia as a “significant improvement” over Wikipedia, which he has criticized as propaganda for left-wing ideas and politics. Ironically, some Grokipedia articles borrow from Wikipedia itself. As Jay Peters of The Verge notes, articles like the one about the MacBook Air include the following at the bottom: “This content is adapted from Wikipedia and licensed under a Creative Commons Attribution-ShareAlike 4.0 license.” Furthermore, Peters found that some Grokipedia articles, such as those about the PlayStation 5 and Lincoln Mark VIII, are almost entirely copied from their corresponding Wikipedia articles.
If you’ve followed Musk’s politics and activities in recent years, you won’t be surprised to learn that he holds right-wing views. This should give pause to anyone who views Grokipedia as an unbiased source of information, especially given that Musk repeatedly retools Grok to produce answers more favorable to right-wing positions. Musk and other critics claim that Wikipedia is left-leaning, yet Grokipedia is constructed entirely by an AI model with its own obvious bias.
You’ll find that your reading experience on some topics on Wikipedia and Grokipedia differs significantly. For example, the Wikipedia article on Tylenol states the following:
In 2025, Donald Trump made several claims about a controversial and unproven link between autism and Tylenol use. These claims about a link between Tylenol use during pregnancy and autism are based on unreliable sources and lack scientific evidence.
Compare this with Grokipedia, which devotes three paragraphs to this topic, the first of which begins:
Numerous observational studies and meta-analyses have found an association between prenatal exposure to acetaminophen (the active ingredient in Tylenol) and an increased risk of neurodevelopmental disorders (NDDs) in offspring, including attention deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD).
The second paragraph, however, highlights some of the problems with these studies, and the third emphasizes that some agencies suggest that “the benefits outweigh the unproven risks.”
Similarly, as WIRED noted, the Grokipedia entry for “Transgender” emphasizes the notion that social media may have been a “contagion” driving the rise of transgender identity. Not only is this a common right-wing claim, but the wording itself may have been lifted from a post by a far-right account on X. The Wikipedia entry, predictably, doesn’t support this claim at all.
Grokipedia also favors unproven, controversial, or downright absurd claims. As Rolling Stone notes, Grokipedia refers to “Pizzagate”—a conspiracy theory that led to a real-life shooting—with terms like “accusations,” “hypotheses,” and “stories,” and endorses the “Great Replacement,” a racist theory promoted by white supremacists.
Is Grokipedia worth using?
Here’s the short answer: no. I have two complaints about Grokipedia. First, no encyclopedia will be reliable if it’s generated almost entirely by AI models. Sure, some of the information may be accurate, and it’s useful to be able to see the sources the bot cites, but when the risk of hallucination is built into the technology itself and there’s no way around it, removing human oversight at this scale practically guarantees inaccuracies that will taint much of Grokipedia’s knowledge base.
Second, Grokipedia is built on the foundation of an LLM that Musk openly modifies to produce results more closely aligned with his worldview and that of a specific political ideology. Hallucinations and bias: just what an encyclopedia needs.
The unique thing about Wikipedia is that it’s written and edited by humans. These humans hold one another accountable, adding new information as it becomes available and correcting errors when they encounter them. It might be disturbing to read that your favorite Secretary of Health and Human Services “promoted vaccine misinformation and public health conspiracy theories,” but that’s the objective, scientific reality. Removing these objective descriptions and reframing the discussion to fit a distorted worldview doesn’t make Grokipedia better than Wikipedia—it renders it useless.