From Surf Wiki (app.surf) — the open knowledge base
Content farm
Organization that creates web content optimised for views
A content farm or content mill is an organization focused on generating large amounts of web content, often designed specifically to satisfy search engine algorithms and maximize retrieval, a practice known as search engine optimization (SEO). Such organizations often employ freelance creators or use artificial intelligence (AI) tools, with the goal of producing as much content as possible in the shortest time and at the lowest cost. The primary aim is to attract as many page views as possible and thus generate more advertising revenue. The emergence of these outlets is often tied to producing content that matches "true market demand" as inferred from search engine queries. Content farms have been criticized for their reliance on sensationalism, misinformation, and, following the AI boom of the 2020s, generative artificial intelligence, all of which have degraded the accuracy of information in circulation.
History
Historically, content farms have outsourced the creation of their content to individuals in poorer countries to enlarge profit margins by keeping workers' pay low. These operations increasingly leverage AI tools to generate content at an accelerated pace. This content can be anything that circulates on the internet, e.g., videos, news articles, social media posts, or blogs.
The growth of the digital advertising industry incentivized the rise of content farms, as a content creator's ad revenue is proportional to the number of users who visit their content (and, by extension, the advertisements alongside it).
Characteristics
Some content farms produce thousands of articles each month using freelance writers or AI tools. For example, in 2009, Wired reported that Demand Media, owner of eHow, was publishing one million items per month, the equivalent of four English-language Wikipedias annually. Another notable example was Associated Content, purchased by Yahoo! in 2010 for $90 million, which later became Yahoo! Voices before shutting down in 2014.
Pay for writers at content farms is low by historical standards for professional writing. For instance, writers may be paid as little as $3.50 per article, though some prolific contributors produce enough content to earn a living. Writers are often not experts in the topics they cover.
Since the rise of large language models such as ChatGPT, content farms have shifted toward AI-generated content. A 2023 report by NewsGuard identified over 140 major brands whose advertising appeared on AI-driven content farms. AI tools allow these sites to generate hundreds of articles daily, often with minimal human oversight.
Criticism
Critics argue that content farms prioritize SEO and ad revenue over factual accuracy and relevance. Critics also highlight the potential for misinformation, such as conspiracy theories and fake product reviews, to spread through AI-generated content. Some have compared content farms to the fast food industry, calling them "fast content" providers that pollute the web with low-value material. The "sponsored" label shown next to some search results has also raised questions about reliability, since such sites likely paid to be pushed to the top of the results.
Criticism of content farms has intensified with the adoption of AI tools, in part because of AI's tendency to "hallucinate" facts. AI's permeation of journalism, even in cases some consider trivial, such as an AI-written summer reading list published by the Chicago Sun-Times, has fed distrust of artificial intelligence. The use of AI to mass-produce content for monetization has become commonplace on the internet.
Content farm accounts with hundreds of thousands or even millions of followers are not uncommon on social media. The use of AI in high-stakes settings such as court cases, as well as in low-stakes ones such as reading lists and social media posts, has left many questioning AI's role in the world.
Wider societal effects have also emerged, such as court cases disrupted when lawyers submitted citations hallucinated by AI tools. In another instance, a New York man used an AI avatar to present his own court-case defense. These episodes have raised concerns about AI bias, AI's susceptibility to fabricating information, and its tendency to err across domains of varying importance, from writing to law.
Content farms can also suffer from AI cannibalism. This is a process in which large language models (LLMs), the systems underlying modern text generation and translation, begin to consume content they themselves created. Over time, such models can drift significantly from the original information on which they were trained. If a content farm's LLM is retrained on its own output, its accuracy falls, compounding misinformation and degrading content quality overall.
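The feedback loop described above can be illustrated with a toy sketch rather than a real LLM: a unigram "model" that is repeatedly retrained on its own sampled output. The corpus, word frequencies, and sample sizes below are invented for illustration only; the point is that rare information, once missed in a sampling round, is lost to every later generation.

```python
import random
from collections import Counter

def train(corpus):
    """'Train' a toy unigram model: estimate word probabilities from a corpus."""
    counts = Counter(corpus)
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

def generate(model, n, rng):
    """Sample n words from the model's own distribution."""
    words = list(model)
    weights = [model[w] for w in words]
    return rng.choices(words, weights=weights, k=n)

rng = random.Random(0)  # fixed seed so the run is reproducible

# Hypothetical "human-written" data: a few common words plus many rare ones.
corpus = ["the"] * 50 + ["cat"] * 20 + ["sat"] * 20 + [f"rare{i}" for i in range(30)]

model = train(corpus)
vocab_sizes = [len(model)]
for generation in range(10):
    # Each generation retrains only on text the previous model produced.
    synthetic = generate(model, 120, rng)
    model = train(synthetic)
    vocab_sizes.append(len(model))

print(vocab_sizes)  # vocabulary can only shrink: dropped words never return
```

Because the generator can only emit words the current model knows, the vocabulary is monotonically non-increasing, and low-probability words tend to vanish within a few generations, a toy analogue of the accuracy drift described above.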
Content farms have also been used to intentionally misinform the public and to influence election results. During the 2016 US election, over 140 fake news websites run from Veles, North Macedonia, posed as American outlets and published sensationalist articles designed to garner shares on social media. The United States was targeted because US Facebook users generate a higher average revenue per user, roughly four times the world average. This revenue potential incentivized writers to create attention-grabbing content they knew would be shared; such articles often drew engagement from hundreds of thousands of users.
Similarly, content farms have used bots to create inauthentic product reviews. This manufactured website traffic encourages advertisers to bid higher for advertising space; because most companies use automated bidding, unverified placements can cost them substantial sums for no return. An estimated $13 billion is wasted on such advertising each year.
Search engine responses
Google attempted to lower the rankings of low-quality websites with its Panda update in 2011. DuckDuckGo implemented measures to block low-quality AI-driven sites in 2024.
Content farms have been a problem for ad exchange platforms; many have policies against them, but enforcement is rare. NewsGuard found that Google was overwhelmingly the most likely to serve ads on content farms.
References
- Benkoil, Dorian. (July 26, 2010). "Don't Blame the Content Farms". PBS.
- Oxenham, Simon. (May 28, 2019). "'I was a Macedonian fake news writer'".
- Knibbs, Kate. "That Sports News Story You Clicked on Could Be AI Slop".
- Eichhorn, Kate. (2022). Content. The MIT Press.
- Roth, Daniel. (October 19, 2009). "The Answer Factory: Demand Media and the Fast, Disposable, and Profitable as Hell Media Model". Wired.
- Plesser, Andy. (May 18, 2010). "Yahoo Harvests 'Content Farm' Associated Content for $90 Million, Report".
- Rossiter, Jay. (July 2, 2014). "Furthering Our Focus". Tumblr.
- (December 17, 2009). "What It's Like To Write For Demand Media: Low Pay But Lots of Freedom".
- Hiar, Corbin. (July 21, 2010). "Writers Explain What It's Like Toiling on the Content Farm". PBS.
- (July 2, 2023). "People Are Spinning Up Content Farms Using AI".
- Thompson, Stuart A. (May 19, 2023). "A.I.-Generated Content Discovered on News Sites, Content Farms and Product Reviews". The New York Times.
- Robles, Patricio. (April 9, 2010). "USA Today turns to the content farm as the ship sinks". Econsultancy.
- Marr, Bernard. (May 16, 2023). "The Danger of AI Content Farms".
- Arrington, Michael. (December 13, 2009). "The End Of Hand Crafted Content".
- Daily, Laura. (January 13, 2025). "It's harder than ever to find reliable product recommendations online". The Washington Post.
- (May 21, 2025). "Chicago Sun-Times issues response after publication of fake book list generated by AI".
- "Rise of the Newsbots: AI-Generated News Websites Proliferating Online". NewsGuard.
- (June 7, 2025). "UK judge warns of risk to justice after lawyers cited fake AI-generated cases in court".
- (May 9, 2025). "From AI avatars to virtual reality crime scenes, courts are grappling with AI in the justice system".
- Prada, Luis. (June 3, 2025). "AI Models Are Cannibalizing Each Other—and It Might Destroy Them".
- Silverman, Craig; Alexander, Lawrence. (November 3, 2016). "How Teens In The Balkans Are Duping Trump Supporters With Fake News". BuzzFeed News.
- Constine, Josh. (April 27, 2016). "Facebook swells to 1.65B users and beats Q1 estimates with $5.38B revenue". TechCrunch.
- "Junk websites filled with AI-generated text are pulling in money from programmatic ads".
- "Finding more high-quality sites in search". Blogspot.
- "The Search Engine Backlash Against 'Content Mills'".
This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.