LLaMa 2 is not open source?

The Open Source Initiative (OSI) has expressed concern about Meta’s use of the term ‘open source’, contending that the license for LLaMa models, particularly LLaMa 2, does not adhere to the OSI’s definition of open source because it restricts commercial use and certain application areas.


The Open Source Initiative (OSI) has issued a statement accusing Meta of misusing the term ‘open source’, arguing that Meta’s license for LLaMa models, specifically LLaMa 2, does not meet the criteria set out in the Open Source Definition (OSD). According to the OSI, Meta’s license runs counter to the fundamental principles of open source by restricting commercial use and certain application areas. The OSI holds that open-source licenses should grant developers full control and ownership over the technology they use and the benefits it brings, and that the essence of open source is that everyone gets to share in the technology, regardless of who they are.

However, the commercial restriction in paragraph 2 of the LLAMA COMMUNITY LICENSE AGREEMENT contradicts this fundamental principle of the Open Source Definition. While Meta’s intention to limit competitive uses of LLaMa is evident, the resulting license does not meet the criteria to be classified as ‘open source’ under the Open Source Definition. According to the OSI, the OSD does not permit restrictions on the scope of use, because no one can predict how a technology will eventually be used or whether the outcomes will be positive or negative. The OSI cites the Linux kernel, which runs in everything from medical devices to airplanes and rockets, as an example of the benefits of this unrestricted approach.

In short, according to the OSI, Meta’s LLaMa license is not open source. The OSI is currently working on a definition of Open Source AI through its Deep Dive process, with the goal of presenting a release candidate of that definition on 17 October 2023.

Why does it matter?

Open-source software (OSS) matters for AI policy because it makes advanced AI technology accessible to more people, fostering innovation. It helps make AI fairer and more understandable by providing tools to detect bias and interpret models. OSS encourages collaboration among researchers and makes AI findings easier to verify and therefore more trustworthy. It also shapes the de facto standards and practices on which much AI development rests. Policymakers therefore need to consider OSS’s impact on AI development, fairness, competition, and standards to make informed policies for the AI future.