From Surf Wiki (app.surf) — the open knowledge base

Allen Institute for AI

U.S. research institute for computer science

Name: Allen Institute for AI
Type: Non-profit research institute
Tax ID: 82-4083177
Formation: 2014
Founder: Paul Allen
Location: Seattle, Washington, U.S.
Key people: Peter Clark, Yejin Choi, Noah Smith, Hannaneh Hajishirzi, Dan Weld, Chris Bretherton, Ani Kembhavi, Jes Lefcourt
CEO: Ali Farhadi

The Allen Institute for AI (abbreviated Ai2) is a 501(c)(3) non-profit scientific research institute founded in 2014 by the late Microsoft co-founder and philanthropist Paul Allen. The institute seeks to conduct high-impact AI research and engineering in service of the common good. Ai2 is based in Seattle and also maintains an active office in Tel Aviv, Israel.

History

Oren Etzioni was appointed by Paul Allen in September 2013 to direct research at the institute. After leading the organization for nine years, Etzioni stepped down as CEO on September 30, 2022. He was replaced on an interim basis by Peter Clark, lead researcher of the institute's Aristo project. On June 20, 2023, Ai2 announced Ali Farhadi as its next CEO, effective July 31, 2023.

Teams

  • Aristo: A flagship project of Ai2. Its original goal was to design an artificially intelligent system that could read, learn, and reason from text, ultimately demonstrating its knowledge by passing an 8th-grade science exam; the team achieved this objective in 2018. The project was inspired by Project Halo, a similar effort carried out by Vulcan, Paul Allen's Seattle-based investment company. The team's current focus is building the next generation of systems that can systematically reason, explain, and continually improve over time.
  • PRIOR: The PRIOR team seeks to advance the field of computer vision by creating AI systems that can see, explore, learn, and reason about the world. The team released the open embodied AI platform AI2-THOR in 2016, supporting the training of AI agents in simulated environments. In February 2018, the team released the game Iconary as a demonstration of an AI that can understand and produce situated scenes from a limited set of icons.
  • Semantic Scholar: An artificial-intelligence-backed search engine for academic publications, publicly released in November 2015. It uses advances in natural language processing to provide features such as summaries of scholarly papers, contextual information about inline citations, and the ability to create libraries of papers and receive paper recommendations.
  • AllenNLP: The AllenNLP team works on research to improve NLP systems' performance and accountability, and advance scientific methodologies for evaluating and understanding NLP systems. The team produces its own research as well as open-source tools to accelerate NLP research.
  • MOSAIC: The Mosaic project is focused on defining and building common sense knowledge and reasoning for AI systems.
  • AI for the Environment: These teams seek to apply artificial intelligence solutions to the prevention of poaching and illegal fishing in locations around the world, as well as environmental problems like climate modeling and wildfire management. The teams in this group include EarthRanger, Skylight, Climate Modeling, and Wildlands.
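As an illustration of the kind of programmatic access Semantic Scholar offers, the sketch below builds a paper-search URL for its public Graph API. The endpoint and parameter names (`query`, `fields`, `limit`) are assumptions based on the publicly documented API, not details taken from this article, and should be checked against the official API documentation before use:

```python
from urllib.parse import urlencode

# Search endpoint of the Semantic Scholar Graph API (an assumption;
# verify against the official API documentation).
SEARCH_ENDPOINT = "https://api.semanticscholar.org/graph/v1/paper/search"

def build_search_url(query, fields=("title", "year", "abstract"), limit=5):
    """Construct a paper-search URL; no network I/O is performed here."""
    params = urlencode({
        "query": query,              # free-text search terms
        "fields": ",".join(fields),  # which paper attributes to return
        "limit": limit,              # maximum number of results
    })
    return f"{SEARCH_ENDPOINT}?{params}"

url = build_search_url("commonsense reasoning")
print(url)
```

To actually fetch results, the URL could be passed to `urllib.request.urlopen` and the JSON response parsed; rate limits and the exact set of supported fields vary, so the official documentation is the authority.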

Generative AI

Ai2 has been actively involved in the development of open-source artificial intelligence through the release of fully open large language models, datasets, and model training assets.

Olmo model family

On May 11, 2023, Ai2 announced it was developing Olmo, an open language model aiming to match the performance of other state-of-the-art language models. In February 2024, 1B- and 7B-parameter variants of the model were open-sourced, including code, model weights with intermediate snapshots and logs, and the contents of the Dolma training dataset, making it, per Ai2, the most open state-of-the-art model available.

In November 2024, Ai2 released the second iteration of Olmo, OLMo 2, with the initial release including 7B and 13B parameter models. In March 2025, Ai2 released a 32B variant of OLMo 2, claiming to have released "the first fully-open model (all data, code, weights, and details are freely available) to outperform GPT3.5-Turbo and GPT-4o mini".

In November 2025, Ai2 released its Olmo 3 set of models, including Olmo 3 Think (7B and 32B), Olmo 3 Base (7B and 32B), Olmo 3 Instruct (7B), and Olmo 3 RL Zero (7B). In December 2025, Ai2 announced Olmo 3.1, an update to the 32B model.

Tulu models and post-training recipes

In addition to the fully open OLMo family of models, Ai2 has also developed Tulu, a family of instruction-tuned models and open post-training recipes that build on open-weights base models (e.g., Meta's Llama) to provide fully transparent alternatives to proprietary instruction-tuning methods. Ai2 released the first iteration of Tulu in June 2023, with subsequent iterations in November 2023 (Tulu 2) and November 2024 (Tulu 3).

Info: Wikipedia Source

This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.
