For anyone working in AI, it is hardly news that “data is the real prize.” With strong data foundations in place, the models and applications built on top of them tend to work well.
But this is where things get messy. Building that foundation is no piece of cake, especially when there are dozens of data sources, each holding valuable information. Teams have to build an integration pipeline for every source, a heavy engineering burden for data teams already juggling different ETL tools to pull in what their AI workloads require. At scale, these pipelines become rigid bottlenecks that are hard to adapt, extend or expand.
Snowflake believes it has an answer.
Today, at its annual Summit, the company announced the general availability of OpenFlow, a fully managed data ingestion service that pulls any type of data from virtually any source, simplifying the work of feeding information into AI deployments quickly.
How does it work?
Powered by Apache NiFi, OpenFlow uses connectors, either prebuilt or custom, with governance and security built into Snowflake. Whether it is unstructured multimedia content from Box or real-time event streams, OpenFlow plugs in, unifies and makes all data types readily available in Snowflake’s AI Data Cloud.
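To make the end state concrete, here is a minimal sketch of what consuming OpenFlow-landed data could look like from Python, using the snowflake-connector-python client. The warehouse, database, schema and table names are hypothetical stand-ins for whatever a given connector populates, and credentials are read from environment variables.

```python
# Minimal sketch: querying data that an OpenFlow connector has landed in
# Snowflake, via the snowflake-connector-python client.
# Hypothetical names: ANALYTICS_WH, RAW, BOX, BOX_DOCUMENTS.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],  # secrets stay out of code
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="RAW",            # hypothetical database fed by OpenFlow
    schema="BOX",              # hypothetical schema for a Box connector
)

try:
    cur = conn.cursor()
    # Hypothetical table the connector keeps in sync with the source system.
    cur.execute(
        "SELECT file_name, modified_at FROM BOX_DOCUMENTS "
        "ORDER BY modified_at DESC LIMIT 10"
    )
    for file_name, modified_at in cur.fetchall():
        print(file_name, modified_at)
finally:
    conn.close()
```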
“Data engineers have often faced a critical trade-off: if they wanted highly controlled pipelines, they had to build and manage complex infrastructure. If they wanted a simple solution, they ran into limits on privacy, flexibility and customization,” said Chris Child, Snowflake’s VP of product for data engineering.
While Snowflake has long offered ingestion options such as Snowpipe for streaming and standalone connectors, OpenFlow is pitched as “an effortless, comprehensive solution for ingesting almost all enterprise data.”
“Snowpipe and Snowpipe Streaming remain a core foundation for customers bringing data into Snowflake, focused on the ‘load’ step of ETL. OpenFlow, on the other hand, handles extraction directly from source systems, then takes care of transformation and loading as well. It is integrated with the new Snowpipe Streaming architecture, so data can flow into Snowflake as soon as it becomes available,” Child explained.
This ultimately opens up new use cases where AI can analyze a complete picture of an enterprise’s data, including documents, images and real-time events, directly inside Snowflake. Once insights are extracted, they can be written back to the source system through the connector.
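In code terms, the division of labor being described might look like the following conceptual sketch. Everything here is an illustrative stub rather than a real Snowflake or OpenFlow API; the point is only the shape of the flow, from extraction at the source through transformation and loading, plus the optional write-back of insights.

```python
# Conceptual sketch only: the ETL-plus-write-back shape described above.
# These functions are illustrative stubs, not Snowflake or OpenFlow APIs.
from dataclasses import dataclass


@dataclass
class Record:
    source_id: str
    payload: dict


def extract(source: str) -> list[Record]:
    """Pull raw records directly from a source system (stubbed)."""
    return [Record(source_id=f"{source}-001", payload={"text": "quarterly contract"})]


def transform(records: list[Record]) -> list[dict]:
    """Normalize raw records into rows ready for the warehouse."""
    return [{"id": r.source_id, **r.payload} for r in records]


def load(rows: list[dict]) -> None:
    """Land the rows in Snowflake (stubbed; a real pipeline streams here)."""
    print(f"loaded {len(rows)} rows")


def write_back(source: str, insight: dict) -> None:
    """Send a derived insight back to the source system (stubbed)."""
    print(f"write-back to {source}: {insight}")


rows = transform(extract("box"))  # OpenFlow-style: extract and transform...
load(rows)                        # ...then load, all in one pipeline
write_back("box", {"id": rows[0]["id"], "label": "contract"})
```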
More than 200 connectors are available

OpenFlow currently supports more than 200 ready-to-use connectors and processors, covering services such as Box, Google Ads, Microsoft SharePoint, Oracle, Salesforce Data Cloud, Workday and Zendesk.
“Box’s integration with Snowflake OpenFlow … extracts data from Box using Box AI, honors the original permissions for secure access, and feeds the data into Snowflake for analysis. It also enables two-way flow, where custom insights or metadata can be written back to Box.”
New connectors take only a few minutes to set up, which speeds up time to value. Users also get security features such as role-based access control, encryption in transit and secrets management to keep data protected end to end.
“Enterprises that require real-time data integration, deal with large volumes of data from diverse sources, or rely on unstructured data such as images, audio and video to extract value will benefit greatly from OpenFlow,” Child said. For example, a retailer could unify point-of-sale, e-commerce, CRM and social media data to deliver personalized experiences and improve operations.
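As a hypothetical illustration of that retail scenario: once connectors have landed point-of-sale, e-commerce and CRM data in Snowflake, unifying them can come down to a single query. All schema, table and column names below are invented for the example, and `conn` is the connection opened in the earlier sketch.

```python
# Hypothetical unification query over tables an OpenFlow deployment might
# maintain; schema, table and column names are invented for illustration.
UNIFY_SQL = """
SELECT c.customer_id,
       COALESCE(s.total, 0) AS in_store_spend,
       COALESCE(o.total, 0) AS online_spend
FROM crm.customers AS c
LEFT JOIN (SELECT customer_id, SUM(amount) AS total
           FROM pos.sales GROUP BY customer_id) AS s
       ON s.customer_id = c.customer_id
LEFT JOIN (SELECT customer_id, SUM(amount) AS total
           FROM ecom.orders GROUP BY customer_id) AS o
       ON o.customer_id = c.customer_id
ORDER BY in_store_spend + online_spend DESC
LIMIT 20
"""

cur = conn.cursor()  # `conn` as opened in the connection sketch above
for row in cur.execute(UNIFY_SQL).fetchall():
    print(row)
```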
Snowflake customers Irwin, Securonix and WorkWave are among those using OpenFlow to move and scale data globally, although the company has not revealed exact adoption numbers.
What’s next?
As a next step, Snowflake aims to make OpenFlow the backbone of real-time, intelligent data movement across distributed systems, anticipating the age of AI agents.
“We are focused on large-scale event transmission and enabling real-time, agent-to-agent communication, so insights and actions flow smoothly across distributed systems. For example, a Cortex agent could deliver events to agents from other enterprise systems, such as a service agent,” said Child.
The timeline for these upgrades remains undisclosed for now.