
Dropbox, Figma CEOs Support Lamini


Stanford professor Andrew Ng, Dropbox CEO Drew Houston, Figma CEO Dylan Field, and OpenAI co-founder Andrej Karpathy invest $25 million in Palo Alto startup Lamini

A group of investors, including a Stanford computer science professor, has contributed $25 million to Lamini, a Palo Alto-based startup building a platform to help enterprises deploy generative AI technology.

Sharon Zhou and Greg Diamos co-founded Lamini several years ago, and the company's sales pitch is intriguing.

According to Zhou and Diamos, many generative AI platforms are far too general-purpose, lacking solutions and infrastructure tailored to the requirements of corporations.

Lamini, by contrast, was designed with enterprises in mind from its inception and is committed to delivering generative AI with high accuracy and scalability.

Zhou, Lamini's CEO, said:

“Generative AI deployment with the highest return on investment is the top priority of virtually every CEO, CIO, and CTO.”

While it is simple for an individual developer to get a working demo running on a laptop, the road to production is replete with setbacks.

As Zhou points out, numerous organizations have voiced their discontent with the obstacles that impede the enterprise-wide adoption of generative AI.

According to a March survey by MIT Insights, 75% of organizations have experimented with generative AI, but only 9% have deployed it at scale.

The greatest obstacles range from inadequate governance structures to insufficient skills, high implementation costs, and a dearth of IT infrastructure and capabilities.

Security concerns also significantly affect the ability to adopt generative AI: a recent survey by Insight Enterprises found that 38% of companies were held back by them.

This is where Lamini comes in. Zhou claims that “every piece” of Lamini’s technology stack, including the engines that support model orchestration, fine-tuning, running, and training, has been optimized for enterprise-scale generative AI workloads.

While “optimized” is admittedly a nebulous term, Lamini has pioneered a process Zhou calls “memory tuning”: a method of training a model to recall portions of its training data precisely.

Zhou claims that memory tuning may mitigate hallucinations, instances in which a model fabricates facts in response to a request.

Memory tuning goes a step further than fine-tuning by training a model on proprietary data comprising crucial facts, figures, and numbers.

This enables the model to memorize and recall the exact match of any key information instead of generalizing or hallucinating, Nina Wei, an AI designer at Lamini, explained via email.
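Lamini has not published the mechanics of memory tuning, but the goal Wei describes can be framed as an exact-match recall metric over a set of key facts. The sketch below is purely illustrative and assumes nothing about Lamini's API; `exact_match_recall` and the toy "models" are hypothetical names, and `model` is any callable that maps a prompt to an answer.

```python
# Illustrative sketch only (not Lamini's implementation): memory tuning,
# as described, aims to drive verbatim recall of key facts to 100%,
# whereas ordinary fine-tuning optimizes for plausible generalization.

def exact_match_recall(facts: dict[str, str], model) -> float:
    """Fraction of key facts the model reproduces verbatim.

    `facts` maps prompts (e.g. "Q3 revenue?") to the exact strings the
    tuned model should recall; `model` maps a prompt to an answer.
    """
    if not facts:
        return 1.0
    hits = sum(1 for prompt, answer in facts.items()
               if model(prompt).strip() == answer.strip())
    return hits / len(facts)

# Toy comparison: a lookup that has memorized its facts scores 1.0;
# a paraphrasing "model" that only generalizes scores 0.0.
facts = {"CFO name?": "Jane Doe", "Q3 revenue?": "$12.4M"}
memorized = lambda p: facts.get(p, "")   # recalls exact strings
paraphrasing = lambda p: "about $12M"    # generalizes / hallucinates

print(exact_match_recall(facts, memorized))     # 1.0
print(exact_match_recall(facts, paraphrasing))  # 0.0
```

On this framing, a memory-tuned model is one whose recall on its designated fact set approaches 1.0 without degrading its general behavior; how Lamini achieves that during training is not disclosed.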

I am not sure I buy it. “Memory tuning” does not appear to be a term used in academic discourse; at least, I could not locate any research papers devoted to the subject.

It will be up to Lamini to demonstrate that its “memory tuning” technique is superior to the other methods that have been tried, or are currently being explored, to mitigate hallucinations.

Memory tuning is not Lamini's only selling point, however.

According to Zhou, the platform can operate even in air-gapped environments. Lamini enables organizations to train, deploy, and fine-tune models across a range of infrastructures, including private and public clouds as well as on-premises data centers.

Furthermore, Zhou says the platform scales workloads “elastically,” to more than 1,000 GPUs if the application or use case demands it.

According to Zhou, incentives for closed source models are presently misaligned in the market.

“Our objective is to return control to a broader range of people, not just a select few, beginning with the organizations that place the most value on control and stand to lose the most from their proprietary data being owned by a third party.”

Indeed, Lamini's co-founders possess considerable expertise in artificial intelligence, and both have separately crossed paths with Ng, which no doubt helps explain his investment.

Zhou taught at Stanford, where she oversaw a research group devoted to generative AI. Before obtaining her doctorate in computer science under Ng, she was a machine learning product manager at Google Cloud.

Diamos co-founded MLCommons, the engineering consortium dedicated to developing standard benchmarks for AI models and hardware, as well as its benchmarking suite, MLPerf.

He also collaborated with Ng at Baidu, where Ng oversaw AI research as chief scientist. Before that, Diamos was a software architect at Nvidia, where he oversaw CUDA.

It seems the co-founders' industry connections gave Lamini an advantage in fundraising. Alongside Dropbox CEO Drew Houston, Figma CEO Dylan Field, and OpenAI co-founder Andrej Karpathy, Bernard Arnault, CEO of luxury goods behemoth LVMH, has also, surprisingly, invested in Lamini.

First Round Capital, Amplify Partners, and AMD Ventures are also investors in Lamini; AMD's participation is somewhat ironic given Diamos's roots at Nvidia.

Thanks to AMD's early involvement and provision of data center hardware, Lamini currently runs a number of its models on AMD Instinct GPUs, deviating from prevailing industry practice.

Lamini asserts that its model training and execution performance is comparable to that of equivalent Nvidia GPUs, depending on the workload. Since we lack the apparatus to evaluate that claim, we shall defer to third parties.

Between its seed round and its Series A (led by Amplify), Lamini has raised a total of $25 million to date.

According to Zhou, the funds are being used to triple the company's ten-person workforce, expand its compute infrastructure, and begin development of “deeper technical optimizations.”

Technology titans such as Google, AWS, and Microsoft (through its OpenAI partnership) are among the many enterprise-oriented generative AI vendors that could compete with Lamini's platform.

Google, AWS, and OpenAI have been aggressively courting the enterprise in recent months with features such as streamlined fine-tuning, private fine-tuning on private data, and more.

I asked Zhou about Lamini's clientele, revenue, and overall market traction. She was unwilling to divulge much at this juncture, but said that AMD (through the AMD Ventures tie-in), AngelList, NordicTrack, and several undisclosed government agencies are among Lamini's early (paying) users.

She added, “We are growing rapidly. Our greatest challenge is serving customers. We have only handled inbound demand because we have been inundated.”

“Compared to our peers in the frenzied AI industry, our gross margins and burn look more like those of a traditional technology company, contrary to the perception created by the hype around generative AI.”

“Generative AI represents a tremendous opportunity for businesses,” said Mike Dauber, a general partner at Amplify. “Among the many AI infrastructure companies I have encountered, Lamini is the first to genuinely address the challenges enterprises face. Their solution empowers enterprises to harness the immense value of their private data while meeting even the most rigorous compliance and security standards.”
