Meta's Llama 4 Behemoth model may still be months away



Last month, Meta held LlamaCon, its first AI developer conference. But while the event brought some notable updates for developers, it also felt a bit underwhelming given how central AI is to the company's ambitions. Now, we know more about why, thanks to a report in The Wall Street Journal.

According to the report, Meta originally aimed to release Behemoth, its flagship Llama 4 model, at the April developer event, but later delayed it to June. Now the release appears to have been pushed back again, perhaps to "fall or later," as Meta engineers work to improve the capabilities of the model Mark Zuckerberg has touted as the world's highest-performing model.

Meta has already released the smaller Llama 4 models, Scout and Maverick, and has also teased a fourth lightweight model called "Little Llama." Behemoth, meanwhile, will have 288 billion active parameters and "outperforms GPT-4.5, Claude Sonnet 3.7 and Gemini 2.0 Pro on several STEM benchmarks," the company has said.

Meta hasn't offered an official timeline for Behemoth's release beyond saying last month that it was "still training." While Behemoth got a few nods during the LlamaCon keynote, there were no updates on when it might actually be ready. That may be because it is still months away: per the report, there are questions within Meta about whether the improvements over previous versions are significant enough to justify a public release.

Meta did not immediately respond to a request for comment. As the report notes, it wouldn't be the first company to see a model slip as the industry races to ship new models and outpace competitors. But the delay is still notable given Meta's lofty AI ambitions. Mark Zuckerberg has made AI a top priority for the company, which plans to spend heavily on AI infrastructure this year.


