ElevenLabs debuts Conversational AI 2.0: voice assistants that understand when to pause, speak, and take turns talking





Artificial intelligence is advancing at a rapid clip across companies, and that is especially true of speech and voice AI models.

Case in point: today, the well-funded voice AI startup ElevenLabs, founded by former Palantir engineers, debuted Conversational AI 2.0, a major upgrade to its platform for building voice agents for enterprise use cases such as customer support, call centers, and outbound sales and marketing.

The update introduces a set of new features designed to make interactions more natural, intelligent, and secure, positioning the platform for enterprise-grade applications.

https://www.youtube.com/watch?

The launch comes just four months after the debut of the original platform, reflecting ElevenLabs' commitment to rapid development, and only a day after rival voice AI startup Hume launched its own new conversational voice model, EVI 3.

It also follows the arrival of new open-source voice models, which prompted some AI influencers to declare ElevenLabs dead. Those pronouncements, it seems, were premature.

According to Jozef Marko of ElevenLabs' engineering team, Conversational AI 2.0 is substantially better than its predecessor, setting a new standard for voice-driven experiences.

Enhancing natural speech

One of the most important features of Conversational AI 2.0 is its state-of-the-art turn-taking model.

The technology is designed to handle the nuances of human dialogue, eliminating the awkward pauses and interruptions that can occur in traditional voice systems.

By analyzing conversational cues such as hesitations and filler words, the agent can understand when to speak and when to listen.

This capability is especially relevant for applications such as customer service, where agents must balance quick responses with the natural rhythms of a conversation.
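ElevenLabs has not published the internals of its turn-taking model, but the kinds of signals it describes (pauses, hesitations, filler words) can be illustrated with a toy heuristic. The sketch below is purely illustrative and assumes a streaming transcript plus a silence timer; it is not the company's actual approach.

```python
# Illustrative only: ElevenLabs has not disclosed its turn-taking model's internals.
# This toy heuristic shows the kind of conversational cues such a model weighs.

FILLER_WORDS = {"um", "uh", "er", "hmm", "like"}

def should_agent_speak(transcript_tail: str, silence_ms: int) -> bool:
    """Decide whether the agent should take the turn.

    transcript_tail: the last few words heard from the user (from streaming ASR).
    silence_ms: how long the user has currently been silent.
    """
    words = transcript_tail.lower().strip(".,!? ").split()
    last_word = words[-1] if words else ""

    # A trailing filler word ("um", "uh") usually means the user is still thinking.
    if last_word in FILLER_WORDS:
        return silence_ms > 2000   # wait longer before jumping in

    # A question plus a short pause suggests the turn has been handed over.
    if transcript_tail.rstrip().endswith("?"):
        return silence_ms > 300

    return silence_ms > 700        # default: a modest pause means it's our turn


print(should_agent_speak("so what I was thinking, um", 500))    # False: user hesitating
print(should_agent_speak("what are your opening hours?", 400))  # True: clear handoff
```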

Multilingual support

Conversational AI 2.0 also introduces integrated language detection, enabling seamless multilingual conversations without manual configuration.

This ensures the agent can identify the language a user is speaking and respond in the same language within a single interaction.

The feature caters to global enterprises serving diverse customer bases, removing language barriers and enabling more inclusive experiences.
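ElevenLabs says language detection is built into the platform and needs no configuration. Purely to illustrate the underlying idea, the sketch below uses the open-source langdetect package (an assumed stand-in, not the company's implementation) to route a reply into whatever language the user spoke.

```python
# Conceptual sketch using the open-source langdetect package (pip install langdetect).
# ElevenLabs' detection is native to the platform; this only illustrates the idea of
# answering in whatever language the user spoke.
from langdetect import detect

GREETINGS = {
    "en": "Hi! How can I help you today?",
    "es": "¡Hola! ¿En qué puedo ayudarte hoy?",
    "de": "Hallo! Wie kann ich Ihnen heute helfen?",
}

def reply_in_user_language(user_utterance: str) -> str:
    lang = detect(user_utterance)                # e.g. "en", "es", "de"
    return GREETINGS.get(lang, GREETINGS["en"])  # fall back to English

print(reply_in_user_language("Hola, quisiera saber cuánto cuesta el plan Creator por mes."))
```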

Enterprise-grade RAG

One of the most powerful additions is the built-in retrieval-augmented generation (RAG) system. It allows the AI to access external knowledge bases and retrieve information on the fly, while maintaining minimal latency and strong privacy protections.

In healthcare settings, for example, a medical assistant agent can pull treatment guidelines directly from an enterprise database without delay. In customer support, agents can access up-to-date product details from internal documentation to help users more effectively.
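ElevenLabs has not detailed its RAG pipeline, but the general pattern is well established: retrieve the most relevant internal document for a query, then ground the model's answer in it. The sketch below shows that pattern with a TF-IDF retriever from scikit-learn; the knowledge-base contents and retrieval method are illustrative assumptions, not the platform's implementation.

```python
# Generic RAG illustration, not ElevenLabs' implementation: retrieve the most relevant
# internal document for a caller's question, then ground the LLM prompt in it.
# Requires scikit-learn (pip install scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

knowledge_base = [
    "Refunds are issued within 5 business days of the return being received.",
    "The Creator plan includes 250 minutes and roughly $0.12 per additional minute.",
    "Support is available 24/7 via chat and email for Business customers.",
]

def retrieve(query: str, docs: list[str]) -> str:
    vectorizer = TfidfVectorizer().fit(docs + [query])
    doc_vecs = vectorizer.transform(docs)
    query_vec = vectorizer.transform([query])
    best = cosine_similarity(query_vec, doc_vecs).argmax()  # index of closest document
    return docs[best]

question = "How long do refunds take?"
context = retrieve(question, knowledge_base)

# The grounded prompt is what would be passed to the agent's language model.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```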

Multimodality and multi-character support

Beyond these core features, the new ElevenLabs system supports multimodality, meaning agents can communicate via voice, text, or a combination of the two. This flexibility reduces the engineering burden on developers, since an agent only needs to be defined once to work across different communication channels.

Further expanding agent expressiveness, Conversational AI 2.0 introduces a multi-character mode, allowing a single agent to switch between different personas. This can be valuable in scenarios such as creative content development, training simulations, or customer engagement campaigns.
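ElevenLabs has not published the agent schema behind multimodality or multi-character mode, so the sketch below is a hypothetical data model (every field name is an assumption) meant only to show the shape of "define the agent once, reuse it across channels, and switch personas at runtime."

```python
# Hypothetical sketch: every field name here is an assumption, not ElevenLabs' schema.
# It illustrates defining an agent once, reusing it across channels, and switching personas.
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    voice_style: str
    system_prompt: str

@dataclass
class AgentConfig:
    agent_id: str
    channels: list[str] = field(default_factory=lambda: ["voice", "text"])
    personas: dict[str, Persona] = field(default_factory=dict)
    active_persona: str = "default"

    def switch_persona(self, name: str) -> None:
        if name not in self.personas:
            raise ValueError(f"Unknown persona: {name}")
        self.active_persona = name

support_agent = AgentConfig(
    agent_id="support-demo",
    personas={
        "default": Persona("Ava", "calm", "You are a helpful support agent."),
        "trainer": Persona("Max", "energetic", "You run a role-play training session."),
    },
)
support_agent.switch_persona("trainer")   # same agent, different character
print(support_agent.active_persona, support_agent.channels)
```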

Batch outbound calling

For enterprises looking to automate outreach at scale, the platform now supports batch calling.

Organizations can initiate multiple outbound calls simultaneously using conversational AI agents, an approach well suited to surveys, alerts, and personalized messages.

The feature aims to increase reach and operational efficiency, offering a more scalable alternative to manual outbound efforts.
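The batch-calling feature itself runs on the ElevenLabs platform, but the fan-out pattern it implies is easy to picture. The asyncio sketch below is a hypothetical illustration: start_call is a placeholder coroutine, not a real API, standing in for whichever telephony or agent endpoint an integrator would use.

```python
# Hypothetical sketch of fanning out outbound calls concurrently with asyncio.
# start_call is a placeholder, not a real API; the actual batch-calling feature
# is handled by the ElevenLabs platform itself.
import asyncio

async def start_call(phone_number: str, message: str) -> str:
    await asyncio.sleep(0.1)          # stand-in for dialing and running the agent
    return f"{phone_number}: delivered '{message}'"

async def run_batch(numbers: list[str], message: str) -> list[str]:
    tasks = [start_call(n, message) for n in numbers]
    return await asyncio.gather(*tasks)   # all calls proceed concurrently

numbers = ["+15550100", "+15550101", "+15550102"]
results = asyncio.run(run_batch(numbers, "Your appointment is tomorrow at 9 AM."))
print("\n".join(results))
```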

Enterprise-grade standards and pricing plans

Alongside the features that enhance communication and engagement, Conversational AI 2.0 places a strong emphasis on trust and compliance. The platform is fully HIPAA-compliant, a critical requirement for healthcare applications that demand strict privacy and data protection. It also supports optional European Union data residency, aligning with data sovereignty requirements in Europe.

ElevenLabs pairs these compliance-focused capabilities with enterprise-grade security and reliability. Designed for high availability and integration with third-party systems, Conversational AI 2.0 is positioned as a secure, dependable option for companies operating in sensitive or regulated environments.

As for pricing, here are the subscription plans currently available on ElevenLabs that include Conversational AI (a rough cost-estimation sketch follows the list):

  • Free: $0/month, includes 15 minutes and 4 concurrent sessions; requires attribution and does not include a commercial license.
  • Starter: $5/month, includes 50 minutes and 6 concurrent sessions.
  • Creator: $11/month (discounted from $22), includes 250 minutes and 6 concurrent sessions, ~$0.12 per additional minute.
  • Pro: $99/month, includes 1,100 minutes and 10 concurrent sessions, ~$0.11 per additional minute.
  • Scale: $330/month, includes 3,600 minutes and 20 concurrent sessions, ~$0.10 per additional minute.
  • Business: $1,320/month, includes 13,750 minutes and 30 concurrent sessions, ~$0.096 per additional minute.
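For readers budgeting against these tiers, a quick back-of-the-envelope helper using the figures above (base price, included minutes, approximate overage rate) can estimate a monthly bill for a given usage level.

```python
# Back-of-the-envelope cost estimate built from the plan figures listed above.
PLANS = {
    "creator":  {"base": 11,   "included_min": 250,    "overage": 0.12},
    "pro":      {"base": 99,   "included_min": 1_100,  "overage": 0.11},
    "scale":    {"base": 330,  "included_min": 3_600,  "overage": 0.10},
    "business": {"base": 1320, "included_min": 13_750, "overage": 0.096},
}

def monthly_cost(plan: str, minutes_used: int) -> float:
    p = PLANS[plan]
    extra = max(0, minutes_used - p["included_min"])   # minutes billed at the overage rate
    return p["base"] + extra * p["overage"]

# Example: 5,000 agent minutes per month on each plan
for name in PLANS:
    print(f"{name:9s} ${monthly_cost(name, 5_000):,.2f}")
```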

A new chapter in realistic, natural voice interactions

As the company's video introducing the new release puts it: "Conversational AI capabilities have never been greater. The time to build is now."

With Conversational AI 2.0, ElevenLabs aims to provide the tools and infrastructure enterprises need to create intelligent voice agents that genuinely elevate digital interactions.

For those interested in learning more, ElevenLabs encourages developers and enterprises to explore its documentation, visit the developer portal, or contact the sales team to see how Conversational AI 2.0 can enhance customer experiences.



