EnCharge AI, an AI chip startup that has raised $144 million to date, announced the EnCharge EN100, an AI accelerator built on precise and scalable analog in-memory computing.
The EN100 is designed to bring advanced AI capabilities to laptops, workstations, and edge devices. It harnesses transformative efficiency to deliver more than 200 TOPS (tera operations per second, a measure of AI performance) of total compute power within the power constraints of edge and client platforms such as laptops.
The company spun out of Princeton University on a bet that its analog in-memory chips would both speed up AI processing and cut its cost.
“EN100 represents a fundamental shift in AI computing architecture, rooted in hardware and software innovations that have been de-risked through fundamental research spanning multiple generations of silicon development,” said Naveen Verma, CEO of EnCharge AI, in a statement. “These innovations are now available as products for the industry to use: scalable, programmable AI inference solutions that break through the energy-efficiency limits of today’s digital solutions. This means advanced, secure, and personalized AI can run locally, without relying on cloud infrastructure. We hope this radically expands what you can do with AI.”
Until now, the models driving the next generation of the AI economy, multimodal and reasoning systems, have required massive data-center processing power. The cost, latency, and security drawbacks of cloud dependence have put countless AI applications out of reach.
EN100 breaks those constraints. By fundamentally reshaping where AI inference happens, it lets developers deploy sophisticated, secure, personalized applications locally.
The company said this breakthrough enables organizations to quickly integrate advanced capabilities into existing products, democratizing powerful AI technologies and bringing high-performance inference directly to end users.
EN100, the first of the EnCharge EN series of chips, features an optimized architecture that processes AI tasks efficiently while minimizing energy use. Available in two form factors, M.2 for laptops and PCIe for workstations, EN100 is designed to transform on-device capabilities:
● M.2 for laptops: Delivering 200+ TOPS of AI compute in an 8.25W power envelope, the EN100 M.2 enables sophisticated AI applications on laptops without compromising battery life or portability.
● PCIe for workstations: Featuring four NPUs that together reach approximately 1 PetaOPS of compute.
EnCharge AI’s comprehensive software suite delivers full platform support across the evolving model landscape with maximum efficiency. This purpose-built ecosystem combines specialized optimization tools, a high-performance compiler, and extensive development resources, all supporting popular frameworks such as PyTorch and TensorFlow.
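The article does not document EnCharge’s SDK, so the sketch below only illustrates the kind of developer flow such a toolchain implies: build a model in a supported framework (PyTorch here), export it to a portable interchange format, and hand it to a vendor compiler. The export step uses real PyTorch APIs; the final compile command is a purely hypothetical placeholder.

```python
# Hedged sketch of a generic "framework model -> accelerator" flow.
# The ONNX export uses real PyTorch APIs; the compile step at the end
# is a HYPOTHETICAL placeholder, since EnCharge's tooling is not
# described in this article.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """A small stand-in model; any PyTorch model would follow the same path."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128),
            nn.ReLU(),
            nn.Linear(128, 10),
        )

    def forward(self, x):
        return self.net(x)

model = TinyClassifier().eval()
example_input = torch.randn(1, 1, 28, 28)

# Export to ONNX, the interchange format most accelerator toolchains consume.
torch.onnx.export(
    model, example_input, "tiny_classifier.onnx",
    input_names=["image"], output_names=["logits"],
)

# A vendor compiler would then quantize and map the graph onto the NPU,
# e.g. (hypothetical command, not a real EnCharge tool):
#   $ encharge-compile tiny_classifier.onnx --target en100 -o tiny_classifier.bin
```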
Compared with competing solutions, EN100 demonstrates up to 20x better performance per watt across diverse AI workloads. With up to 128 GB of high-density LPDDR memory and 272 GB/s of bandwidth, EN100 efficiently handles sophisticated AI tasks, such as generative language models and computer vision, that typically require specialized data-center hardware. EN100’s programmability ensures optimized performance for today’s AI models and adaptability for the AI models of tomorrow.
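As a rough sanity check on those claims, the M.2 figures quoted above work out to roughly 24 TOPS per watt. The snippet below is back-of-envelope arithmetic on the article’s own numbers, not a vendor benchmark.

```python
# Back-of-envelope arithmetic using figures quoted in this article
# (200+ TOPS in an 8.25W M.2 power envelope; 128 GB LPDDR at 272 GB/s).
tops = 200.0          # total AI compute, trillion operations per second
power_w = 8.25        # M.2 power envelope in watts
mem_gb = 128.0        # high-density LPDDR capacity
bw_gbps = 272.0       # memory bandwidth in GB/s

print(f"Compute efficiency: {tops / power_w:.1f} TOPS/W")            # ~24 TOPS/W
print(f"Time to read all of memory once: {mem_gb / bw_gbps:.2f} s")  # ~0.47 s
```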
“The real magic of EN100 is that it makes transformative efficiency for AI inference readily accessible to our partners, which they can use to deliver on their ambitious AI roadmaps,” said Ram Rangarajan, senior vice president at EnCharge AI. “For client platforms, EN100 can bring sophisticated AI capabilities on-device, powering a new generation of intelligent applications that are not only faster and more responsive but also more secure and personalized.”
Early adoption partners have already begun working closely with EnCharge to map out how EN100 can deliver transformative AI experiences, such as multimodal AI agents and enhanced gaming that renders realistic environments in real time.
Although the first round of EN100’s early access program is currently full, interested developers and OEMs can sign up at www.encharge.ai/en100 to learn more about the upcoming Round 2 program, which offers a unique opportunity to gain a competitive advantage by being among the first to build commercial applications on EN100.
Competition
EnCharge says it does not compete directly with many of the big players, since its focus and strategy are somewhat different. The company prioritizes the growing AI PC and edge hardware market, where its energy-efficiency advantage is most compelling, rather than competing head-on in the data-center market.
Still, EnCharge has several advantages that make it uniquely competitive in the chip landscape. For one, EnCharge’s chip is dramatically more energy efficient (roughly 20 times more) than those of the prominent players. The chip can run the most advanced AI models while drawing about as much power as a light bulb, making it a very competitive offering for any use case that cannot be confined to the data center.
Second, EnCharge’s analog in-memory computing approach makes its chips far denser in compute than traditional digital architectures, at roughly 30 TOPS/mm² versus 3. This lets customers pack more AI processing power into the same physical space, which is especially valuable for laptops, smartphones, and other devices where area is at a premium. OEMs can integrate powerful AI capabilities without compromising device size, weight, or form factor, allowing them to build sleeker, more compact products while still delivering advanced AI features.
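To make the density comparison concrete, the arithmetic below estimates how much silicon area each approach would need to reach the 200 TOPS figure quoted earlier for EN100. It uses only the 30 versus 3 TOPS/mm² numbers above and ignores memory, I/O, and everything else on a real die, so it is illustrative only.

```python
# Illustrative area comparison from the density figures quoted above.
target_tops = 200.0       # compute target taken from the EN100 M.2 spec
density_analog = 30.0     # TOPS per mm^2 (analog in-memory, per the article)
density_digital = 3.0     # TOPS per mm^2 (traditional digital, per the article)

area_analog = target_tops / density_analog     # ~6.7 mm^2
area_digital = target_tops / density_digital   # ~66.7 mm^2
print(f"Analog in-memory: ~{area_analog:.1f} mm^2 of compute area")
print(f"Digital baseline: ~{area_digital:.1f} mm^2 of compute area")
```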
Origins

In March 2024, EnCharge partnered with Princeton University to secure an $18.6 million grant from DARPA’s Optimum Processing Technology Inside Memory Arrays (OPTIMA) program, a $78 million effort to develop fast, power-efficient, and scalable compute-in-memory accelerators for AI workloads that cannot be achieved with current technology.
EnCharge’s inspiration came from a critical challenge in AI: the inability of traditional computing architectures to keep up with AI’s needs. The company was founded to solve that problem. As AI models grew dramatically in size and complexity, traditional chips (such as GPUs) struggled to keep pace, creating both memory and processing bottlenecks along with soaring energy requirements. (For example, a single large language model can consume as much electricity as 130 U.S. households use in a year.)
The specific technical inspiration came from founder Naveen Verma’s research at Princeton University on next-generation computing architectures. He and his collaborators spent more than seven years exploring a range of novel computing architectures, which led to a breakthrough in analog in-memory computing.
This approach promises to dramatically improve the energy efficiency of AI workloads while mitigating the noise and other challenges that hampered earlier attempts at analog computing. That technical achievement, proven and de-risked across multiple generations of silicon, became the basis for founding EnCharge AI to bring analog in-memory computing solutions for AI inference to market.
EnCharge AI launched in 2022, led by a team with deep experience in semiconductors and AI systems. The team spun the technology out of Princeton University, focusing on scalable analog in-memory AI inference and the accompanying software.
The company overcame earlier barriers to analog and in-memory chip architectures by using precise capacitors formed from metal wires in place of noise-prone transistors. The result is a full-stack architecture that is up to 20 times more energy efficient than the digital AI chips available today or coming soon.
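The article does not detail EnCharge’s circuits, but the general idea behind charge-domain in-memory computing can be sketched briefly: weights are encoded as capacitances, activations determine how much charge each capacitor contributes to a shared line, and the accumulated charge approximates a dot product before being digitized. The toy model below illustrates that general class of technique only, with made-up noise and ADC parameters, and is not EnCharge’s actual design.

```python
# Toy model of a charge-domain in-memory multiply-accumulate (MAC).
# Purely conceptual; parameters are invented for illustration.
import numpy as np

def analog_mac(weights, activations, noise_sigma=0.01, adc_bits=8, rng=None):
    """Model one output computed by charge sharing on a summation line."""
    rng = rng or np.random.default_rng()
    # Each weight is realized as a slightly mismatched capacitor (unit capacitance = 1).
    caps = weights * (1.0 + rng.normal(0.0, noise_sigma, weights.shape))
    # Charge shared onto the summation line approximates the dot product.
    analog_sum = caps @ activations
    # A finite-resolution ADC digitizes the accumulated charge.
    full_scale = np.sum(np.abs(weights)) + 1e-12
    half_levels = 2 ** (adc_bits - 1)
    code = np.clip(np.round(analog_sum / full_scale * half_levels), -half_levels, half_levels)
    return code / half_levels * full_scale

rng = np.random.default_rng(0)
w = rng.integers(-3, 4, size=64).astype(float)   # signed, already-quantized weights
x = rng.random(64)                               # activations in [0, 1)
print("exact digital dot product:", w @ x)
print("modeled analog MAC result:", analog_mac(w, x, rng=rng))
```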
With this technology, EnCharge fundamentally changes how AI computation happens. Its technology dramatically reduces the energy required for AI compute, bringing advanced AI workloads out of the data center and onto laptops, workstations, and edge devices. By moving AI inference closer to where data is created and used, EnCharge enables a new generation of AI-powered devices and applications that were previously impossible due to energy, weight, or size constraints, while improving security, latency, and cost.
Why it matters

As AI models have grown dramatically in size and complexity, their chip and energy requirements have grown with them. Today, the vast majority of AI inference compute is done by massive clusters of power-hungry chips in cloud data centers. This creates cost, latency, and security barriers to applying AI to use cases that require on-device compute.
Only with transformative increases in compute efficiency will AI be able to move out of the data center and handle on-device use cases that are size-, weight-, and power-constrained, or that have latency or privacy requirements best served by keeping data local. Reducing the cost and access barriers to advanced AI could have enormous effects on a wide range of industries, from consumer electronics to aerospace and defense.
Dependence on data centers also creates bottleneck risks. AI-driven demand for advanced GPUs could increase total demand for certain upstream components by 30% or more by 2026, and a demand increase of roughly 20% or more already has a high likelihood of upsetting the supply-demand equilibrium and causing a chip shortage. The industry is already seeing this in the enormous cost of the latest GPUs and years-long waiting lists, as a handful of AI companies buy up all available stock.
The environmental and energy requirements of these data centers are also unsustainable with current technology. The energy used for a single Google search has increased more than 20x, from 0.3 watt-hours to 7.9 watt-hours, with the addition of AI to search. In total, the International Energy Agency (IEA) projects that data-center electricity consumption in 2026 will be double that of 2022, about 1,000 terawatt-hours, roughly equivalent to Japan’s current electricity consumption.
EnCharge’s investors include Tiger Global Management, Samsung Ventures, IQT, RTX Ventures, VentureTech Alliance, Anzu Partners, AlleyCorp, and ACVC Partners. The company has 66 employees.