WekaIO Inc., whose data management platform can store billions of files, has closed a $140 million late-stage funding round.
The software maker said the Series E round included a secondary stock offering alongside the equity funding. Existing investor Valor Equity Partners led the financing, which values the company at $1.6 billion. Nearly a dozen other investors participated, including Hitachi Ventures, Qualcomm Ventures and Nvidia Corp.
Organizations spread their information across several types of storage infrastructure. The data used to train a company's artificial intelligence models is likely kept on fast but expensive flash devices, while lower-priority records typically reside on cheaper hardware. Backups, meanwhile, are often held in a separate storage environment with technical requirements of its own.
Each of those storage environments is often powered by a different data management platform. That complicates administrators' work, because they have to switch among multiple management interfaces.
Weka says its data management platform lets businesses deploy a single software layer across all of those different storage systems. The platform runs on both on-premises and cloud infrastructure and can also manage an organization's backups. According to Weka, administrators can control all of those environments through one centralized interface and manage multiple exabytes of data with it.
The company's primary target use case is powering artificial intelligence workloads. AI training datasets commonly comprise a huge number of relatively small items, such as webpages, that are often stored in a variety of formats. Weka says its platform can hold up to billions of files containing both structured and unstructured data.
“Unstructured data management is a common data management challenge in the AI era, so enterprise data stacks must be able to handle a variety of IO patterns, data types and sizes at ever-increasing volumes and velocity,” Weka co-founder and Chief Executive Officer Liran Zvibel (pictured, right, with co-founder Maor Ben Dayan) wrote in a blog post today.
The Weka platform includes several performance optimizations geared toward AI. Normally, moving data to the Nvidia graphics cards that power AI models requires first copying it into the memory of the host servers' central processing units. Weka supports Nvidia's GPUDirect Storage technology, which removes the need for those intermediate copies and thereby speeds up data transfers.
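The difference is easiest to see in code. The sketch below is a minimal, hypothetical illustration, not Weka's code: it contrasts the conventional "bounce buffer" read, which stages data in CPU memory, with a direct read into GPU memory using Nvidia's cuFile API, the programming interface behind GPUDirect Storage. The file path and chunk size are illustrative assumptions, and error handling is omitted for brevity.

```c
/*
 * Illustrative sketch only (not Weka-specific code): a staged read
 * through host memory vs. a GPUDirect Storage read via the cuFile API.
 */
#include <cufile.h>
#include <cuda_runtime.h>
#include <fcntl.h>
#include <unistd.h>
#include <string.h>
#include <stdlib.h>

#define CHUNK_BYTES (64UL << 20)  /* 64 MiB, arbitrary example size */

/* Conventional path: data is staged in host (CPU) memory first. */
static void staged_read(int fd, void *dev_buf) {
    void *host_buf = malloc(CHUNK_BYTES);
    pread(fd, host_buf, CHUNK_BYTES, 0);           /* disk -> CPU memory */
    cudaMemcpy(dev_buf, host_buf, CHUNK_BYTES,
               cudaMemcpyHostToDevice);            /* CPU memory -> GPU  */
    free(host_buf);
}

/* GPUDirect Storage path: the read lands directly in GPU memory. */
static void direct_read(int fd, void *dev_buf) {
    CUfileDescr_t descr;
    CUfileHandle_t handle;

    memset(&descr, 0, sizeof(descr));
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;

    cuFileDriverOpen();
    cuFileHandleRegister(&handle, &descr);
    cuFileRead(handle, dev_buf, CHUNK_BYTES,
               /*file_offset=*/0, /*dev_offset=*/0); /* disk -> GPU, no CPU copy */
    cuFileHandleDeregister(handle);
    cuFileDriverClose();
}

int main(void) {
    void *dev_buf;
    cudaMalloc(&dev_buf, CHUNK_BYTES);

    /* Hypothetical dataset file; O_DIRECT is typically required for GDS. */
    int fd = open("/mnt/dataset/shard-000.bin", O_RDONLY | O_DIRECT);
    staged_read(fd, dev_buf);   /* baseline: two hops */
    direct_read(fd, dev_buf);   /* direct path: one hop */
    close(fd);

    cudaFree(dev_buf);
    return 0;
}
```

Skipping the host staging step is the whole point: the extra copy through CPU memory is what the direct path eliminates.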
AI models and other applications can experience performance dips when other workloads in the same environment consume too much of the hardware. Weka says a feature called Converged Mode addresses that problem by letting administrators guarantee that a critical workload always has access to a set amount of RAM, storage and other hardware resources.
“With the arrival of gen AI, enterprise data requirements are becoming increasingly complex, while expectations for speed are rising,” Zvibel detailed. “GPUs and modern networking have improved their performance by four to five orders of magnitude over the last decade, and they require much faster access to data to ensure they are balanced.”
Weka says its installed base comprises more than 300 organizations, including government agencies, academic institutions and companies such as Samsung Electronics Co. Ltd. One customer, the University of Surrey, used the platform to accelerate some AI workloads by a factor of eight, according to the company.
According to TechCrunch, that installed base generates more than $100 million in annual recurring revenue for the company. Weka hopes to become cash flow-positive by December. The software maker plans to use the newly raised capital to build additional features and provide liquidity to early employees.