Build or buy? Scaling your enterprise gen AI pipeline in 2025


By [email protected]


This article is part of VentureBeat Magazine’s special issue, “AI at Scale: From Vision to Feasibility.” Read more from this special issue here.


Scaling the adoption of generative AI tools has always been a challenge of balancing ambition with practicality, and in 2025 the stakes are higher than ever. Companies racing to adopt large language models (LLMs) face a new reality: scaling is not just about deploying larger models or investing in cutting-edge tools, it is about integrating AI in ways that transform operations, empower teams, and optimize costs. Success depends on more than technology; it requires a cultural and operational transformation that aligns AI capabilities with business goals.

The inevitability of scaling: Why 2025 is different

As generative AI evolves from the pilot phase to enterprise-wide deployments, companies are facing an inflection point. The excitement of early adoption has given way to the practical challenges of maintaining efficiency, managing costs, and ensuring relevance in competitive markets. Scaling AI in 2025 is about answering tough questions: How can companies make generative tools impactful across departments? What infrastructure will support AI growth without throttling resources? And perhaps most importantly, how do teams adapt to AI-driven workflows?

Success depends on three critical principles: identifying clear, high-value use cases; maintaining technological flexibility; and building a workforce equipped to adapt. Successful organizations are not only embracing gen AI, they are developing strategies that align the technology with business needs, continually reevaluating the costs, performance and cultural shifts required for sustainable impact. This approach is not just about deploying cutting-edge tools; it is about building operational flexibility and scalability in an environment where technology and markets evolve at breakneck speed.

Companies like Wayfair and Expedia embody these lessons, showing how a hybrid approach to LLM adoption can transform operations. By blending external platforms with custom-built solutions, these companies demonstrate the power of balancing flexibility and precision, setting a model for others.

Combining customization and flexibility

The decision to build or buy gen AI tools is often framed as a binary choice, but Wayfair and Expedia demonstrate the benefits of a more nuanced strategy. Fiona Tan, CTO at Wayfair, stresses the value of balancing flexibility and precision. Wayfair uses Google Vertex AI for general-purpose applications while developing proprietary tools for specialized requirements. Tan described the company's iterative approach, noting how smaller, more cost-effective models often outperform larger, more expensive options at tagging product attributes such as fabric and furniture colors.
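Tan's point about model selection can be made concrete with a simple cost-aware picker: given offline evaluation scores for several candidate models on a tagging task, choose the cheapest model that clears an accuracy bar. This is a minimal illustrative sketch, not Wayfair's actual tooling; the model names, costs, and accuracy figures are all hypothetical.

```python
# Cost-aware model selection: among candidates that meet an accuracy
# threshold on an offline eval set, pick the cheapest one.
# All model names, costs, and accuracy figures below are hypothetical.

def pick_model(candidates, min_accuracy):
    """candidates: list of dicts with 'name', 'accuracy', 'cost_per_1k_tokens'."""
    eligible = [m for m in candidates if m["accuracy"] >= min_accuracy]
    if not eligible:
        raise ValueError("no model meets the accuracy bar")
    return min(eligible, key=lambda m: m["cost_per_1k_tokens"])

candidates = [
    {"name": "large-llm", "accuracy": 0.94, "cost_per_1k_tokens": 0.030},
    {"name": "small-llm", "accuracy": 0.92, "cost_per_1k_tokens": 0.002},
    {"name": "tiny-llm",  "accuracy": 0.71, "cost_per_1k_tokens": 0.0005},
]

best = pick_model(candidates, min_accuracy=0.90)
print(best["name"])  # the small model wins on cost once it clears the bar
```

Run periodically against a held-out eval set, a policy like this lets the accuracy bar, rather than model size, drive spending.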

Likewise, Expedia uses a multi-vendor LLM proxy layer that enables seamless integration across different models. Rajesh Naidu, Expedia's senior vice president, describes the strategy as a way to maintain flexibility while optimizing costs. "We are always opportunistic, looking for the best (models) where it makes sense, but we are also willing to build in our own space," Naidu explains. This flexibility ensures the team can adapt to evolving business needs without being locked into a single vendor.

These hybrid approaches recall the evolution of enterprise resource planning (ERP) in the 1990s, when organizations had to choose between rigid off-the-shelf solutions and highly customized systems built to fit their workflows. Then, as now, the companies that succeeded recognized the value of combining external tools with tailored development to address specific operational challenges.

Operational efficiency of core business functions

Both Wayfair and Expedia demonstrate that the real power of LLMs lies in targeted applications that deliver measurable impact. Wayfair uses generative AI to enrich its product catalog, enhancing metadata autonomously and accurately. This not only streamlines workflows but also improves search and customer recommendations. Tan highlights another transformative application: using LLMs to analyze legacy database structures. With the original system designers no longer available, gen AI lets Wayfair pay down technical debt and uncover new efficiencies in legacy systems.

Expedia has found success integrating gen AI across its customer service and developer workflows. Naidu notes that a custom AI tool designed to summarize calls ensures that "90% of travelers can reach an agent within 30 seconds," contributing to a significant improvement in customer satisfaction. In addition, GitHub Copilot is deployed enterprise-wide, speeding up code generation and debugging. These operational gains underscore the importance of aligning gen AI capabilities with clear, high-value business use cases.

The role of hardware in scaling gen AI

Hardware considerations for scaling LLM applications are often overlooked, but they play a critical role in long-term sustainability. Both Wayfair and Expedia currently rely on cloud infrastructure to manage their gen AI workloads. Tan notes that Wayfair continues to evaluate the scalability of cloud providers like Google, while weighing the potential need for on-premises infrastructure to handle real-time applications more efficiently.

Expedia's approach also emphasizes flexibility. Primarily hosted on AWS, the company uses a proxy layer to dynamically route tasks to the most appropriate compute environment. This system balances performance with cost efficiency, ensuring that inference costs do not spiral out of control. Naidu highlights the importance of this adaptability as enterprise AI applications grow more complex and demand more processing power.
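A proxy layer like the one Naidu describes can be sketched as a rule-based router: each incoming task is matched to a backend by task type and latency budget, with a cost-ranked fallback. This is an illustrative sketch only; the backend names, costs, and routing rules are hypothetical, not Expedia's implementation.

```python
# Minimal rule-based LLM routing proxy: pick a backend per request
# based on task type and a latency budget. Backend names, costs,
# and latencies are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    cost_per_call: float   # relative cost units
    p95_latency_ms: int    # typical worst-case latency

BACKENDS = {
    "summarize": Backend("vendor-a-small", cost_per_call=1.0, p95_latency_ms=400),
    "code":      Backend("vendor-b-code",  cost_per_call=3.0, p95_latency_ms=900),
    "default":   Backend("in-house-llm",   cost_per_call=0.5, p95_latency_ms=1500),
}

def route(task_type: str, latency_budget_ms: int) -> Backend:
    """Use the task-specific backend if it fits the latency budget;
    otherwise fall back to the cheapest backend that does."""
    preferred = BACKENDS.get(task_type, BACKENDS["default"])
    if preferred.p95_latency_ms <= latency_budget_ms:
        return preferred
    fits = [b for b in BACKENDS.values() if b.p95_latency_ms <= latency_budget_ms]
    if not fits:
        return BACKENDS["default"]  # degrade gracefully rather than fail
    return min(fits, key=lambda b: b.cost_per_call)

print(route("summarize", latency_budget_ms=500).name)  # vendor-a-small
```

Because the routing table is data rather than code, a new vendor or an in-house model can be added without touching callers, which is the lock-in protection the article describes.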

This focus on infrastructure reflects broader trends in enterprise computing, reminiscent of the shift from monolithic data centers to microservices architectures. As companies like Wayfair and Expedia expand their LLM capabilities, they showcase the importance of balancing cloud scalability with emerging options like edge computing and custom chips.

Training, governance and change management

Deploying LLMs is not just a technological challenge, it is a cultural one. Both Wayfair and Expedia emphasize the importance of building organizational readiness to adopt and integrate gen AI tools. At Wayfair, comprehensive training ensures that employees across departments can adapt to new workflows, especially in areas like customer service, where AI-generated responses require human oversight to match the company's voice and tone.

Expedia has gone a step further on governance by creating a Responsible AI Council to oversee all major AI-related decisions. The council ensures that deployments align with ethical guidelines and business objectives, building trust across the organization. Naidu stresses the importance of rethinking how gen AI effectiveness is measured: traditional KPIs often fall short, pushing Expedia to adopt precision and recall metrics that better align with business goals.

These cultural adjustments are essential to the long-term success of AI in enterprise settings. Technology alone cannot drive transformation; it requires a workforce equipped to leverage gen AI's capabilities and a governance structure that ensures responsible implementation.

Lessons for success in scaling

The experiences of Wayfair and Expedia offer valuable lessons for any organization looking to scale LLMs effectively. Both companies demonstrate that success hinges on identifying clear business use cases, maintaining flexibility in technology choices, and fostering a culture of adaptability. Their hybrid approaches provide a model for balancing innovation and efficiency, ensuring that gen AI investments deliver tangible results.

What makes scaling AI in 2025 an unprecedented challenge is the pace of both technological and cultural change. The hybrid strategies, resilient infrastructures, and strong data cultures that define successful AI deployments today will lay the foundation for the next wave of innovation. Companies building these foundations now will not just scale AI; they will scale resilience, adaptability and competitive advantage.

Looking ahead, the challenges of inference costs, real-time capabilities, and evolving infrastructure needs will continue to shape the enterprise gen AI landscape. As Naidu puts it: "Gen AI and LLMs will be a long-term investment for us and have differentiated us in the travel space. We have to keep in mind that this will require some conscious investment prioritization and understanding of use cases."


