It's the week of small AI models, apparently.
On Thursday, AI2, the nonprofit AI research institute, released Olmo 2 1B, a 1-billion-parameter model that AI2 claims beats similarly sized models from Google, Meta, and Alibaba on several benchmarks. Parameters, sometimes referred to as weights, are the internal components of a model that guide its behavior.
Olmo 2 1B is available under a permissive Apache 2.0 license on the AI dev platform Hugging Face. Unlike most models, Olmo 2 1B can be replicated from scratch; AI2 has provided the code and the data sets (Olmo-Mix-1124 and Dolmino-Mix-1124) used to develop it.
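For readers who want to try the model, here is a minimal sketch of loading it from Hugging Face with the transformers library. The repository name "allenai/OLMo-2-0425-1B" is an assumption and is not taken from the article; check AI2's Hugging Face page for the exact model ID.

```python
# Minimal sketch: load and sample from Olmo 2 1B via Hugging Face transformers.
# The model ID below is an assumption; verify it on AI2's Hugging Face page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0425-1B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Small language models are useful because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```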
Small models may not be as capable as their larger counterparts, but importantly, they don't require beefy hardware to run. That makes them more accessible to developers and hobbyists contending with the limitations of lower-end consumer machines.
There's been a raft of small model launches over the past few days, from Microsoft's Phi 4 reasoning family to Qwen's 2.5 Omni 3B. Most of these, along with Olmo 2 1B, can easily run on a modern laptop or even a mobile device.
AI2 says that Olmo 2 1B was trained on a data set of 4 trillion tokens from publicly available, AI-generated, and manually created sources. Tokens are the raw bits of data that models consume and generate; 1 million tokens is equivalent to about 750,000 words.
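As a back-of-the-envelope check, applying the article's stated ratio of 1 million tokens to roughly 750,000 words puts the training set at about 3 trillion words; the exact ratio varies by tokenizer and is only an approximation.

```python
# Rough conversion using the ratio cited in the article
# (1 million tokens ≈ 750,000 words); actual ratios depend on the tokenizer.
tokens_trained_on = 4_000_000_000_000      # 4 trillion tokens
words_per_token = 750_000 / 1_000_000      # ≈ 0.75 words per token
approx_words = tokens_trained_on * words_per_token
print(f"{approx_words:,.0f} words")        # ≈ 3,000,000,000,000 words
```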
On GSM8K, a benchmark measuring arithmetic reasoning, Olmo 2 1B scores better than Google's Gemma 3 1B, Meta's Llama 3.2 1B, and Alibaba's Qwen 2.5 1.5B. Olmo 2 1B also outperforms those three models on TruthfulQA, a test of factual accuracy.
AI2 warns that Olmo 2 1B carries risks, however. Like all AI models, it can produce "problematic outputs," including harmful and "sensitive" content, the organization says, as well as factually inaccurate statements. For these reasons, AI2 recommends against deploying Olmo 2 1B in commercial settings.