Dropbox, Figma CEOs back Lamini, a startup building a generative AI platform for enterprises | TechCrunch

Lamini, a Palo Alto-based startup that has built a platform to help enterprises deploy generative AI tech, has raised $25 million from investors including Stanford computer science professor Andrew Ng.

Co-founded several years ago by Sharon Zhou and Greg Diamos, Lamini has an interesting sales pitch.

Zhou and Diamos say many generative AI platforms are too general-purpose, lacking the solutions and infrastructure to meet the needs of corporations. In contrast, Lamini was built from the ground up with enterprises in mind, and is focused on delivering high AI accuracy and scalability.

“Almost every CEO, CIO and CTO's top priority is to leverage generative AI within their organization with maximum ROI,” Zhou, Lamini's CEO, told TechCrunch. “But while it's easy for an individual developer to get a working demo on a laptop, the path to production is littered with failures left and right.”

To Zhou's point, many companies have expressed frustration with the barriers to meaningful adoption of generative AI in their business operations.

According to a March poll from MIT Insights, only 9% of organizations have adopted generative AI at scale despite 75% experimenting with it. The top barriers range from a lack of IT infrastructure and capabilities to poor governance structures, insufficient skills and high implementation costs. Security is a major factor, too: in a recent survey from Insight Enterprises, 38% of companies said security is affecting their ability to leverage generative AI tech.

So what is Lamini's answer?

Zhou says that “every piece” of Lamini's tech stack has been optimized for enterprise-scale AI workloads, from hardware to software, including the engines used for model orchestration, fine-tuning, running and training. “Optimized” is a vague word, granted, but Lamini takes it a step further with what Zhou calls “memory tuning,” a technique to train a model on data in such a way that it can recall parts of that data exactly.

Memory tuning can potentially reduce hallucinations, Zhou asserts, or instances when a model makes up facts in response to a request.

“Memory tuning is a training paradigm, similar to fine-tuning but going further, to train a model on proprietary data that includes key facts, figures and statistics so that the model has higher precision and can memorize and recall any key information rather than generalizing or making it up,” Nina Wei, an AI designer at Lamini, told me via email.

I'm not sure I buy it. “Memory tuning” appears to be more of a marketing term than an academic one; there are no research papers about it, none that I've managed to turn up, at least. I'll leave it to Lamini to show evidence that its “memory tuning” is superior to the other hallucination-reducing techniques being attempted now.

Fortunately for Lamini, memory tuning isn't its only differentiator.

Zhou says the platform can operate in highly secured environments, including air-gapped ones. Lamini lets companies run, debug and train models on a range of configurations, from on-premises data centers to public and private clouds. And it scales workloads “elastically,” Zhou says, reaching over 1,000 GPUs if an application or use case demands it.

“The incentives are currently misaligned in the market with closed-source models,” Zhou said. “We aim to put control back in the hands of more people, not just a few, starting with the organizations that care most about control and have the most to lose from their proprietary data.”

Lamini's co-founders are, for what it's worth, well-accomplished in the AI space. They've also separately brushed shoulders with Ng, which no doubt explains his investment.

Zhou was previously on faculty at Stanford, where she headed a group researching generative AI. Before earning her computer science doctorate under Ng, she was a machine learning product manager at Google Cloud.

Diamos, for his part, co-founded MLCommons, the engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as MLPerf, the MLCommons benchmarking suite. He also led AI research at Baidu, where he worked with Ng while the latter was chief scientist there. Diamos was also a software architect on Nvidia's CUDA team.

The co-founders' industry connections seem to have propelled Lamini on the fundraising front. In addition to Ng, Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy, and — oddly enough — Bernard Arnault, CEO of luxury goods company LVMH, have all invested in Lamini.

AMD Ventures is also an investor (a bit ironic considering Diamos' Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early on, supplying Lamini with data center hardware, and today, Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.

Lamini makes the big claim that its model training and running performance is comparable to that of Nvidia's equivalent GPUs for the relevant workloads. Since we're not equipped to test that claim, we'll leave it to a third party.

To date, Lamini has raised $25 million across seed and Series A rounds (the Series A was led by Amplify). Zhou says the money is being put toward tripling the company's 10-person team, expanding its compute infrastructure and kicking off development into “deeper technical optimization.”

There are a number of enterprise-oriented generative AI vendors that could compete with aspects of Lamini's platform, including tech giants like Google, AWS and Microsoft (via its OpenAI partnership). Google, AWS and OpenAI in particular have been aggressively courting the enterprise in recent months, introducing features like streamlined fine-tuning, private fine-tuning on private data and more.

I asked Zhou about Lamini's customers, revenue and overall go-to-market momentum. She wasn't willing to reveal much at this early stage, but she did say that AMD (via the AMD Ventures tie-in), AngelList and NordicTrack are among Lamini's early (paying) customers, along with several undisclosed government agencies.

“We're growing quickly,” she added. “The number one challenge is serving customers. We've only handled inbound demand because we've been inundated. Given the interest in generative AI, we're not representative of the overall tech slowdown; unlike our peers in the hyped AI world, we have gross margins and look more like a regular tech company.”

“We believe there's a massive opportunity for generative AI in enterprises. While there are a number of AI infrastructure companies, Lamini is the first one I've seen that is taking the problems of the enterprise seriously and creating a solution that helps enterprises unlock the tremendous value of their private data while satisfying even the most stringent compliance and security requirements,” said Mike Dauber, a general partner at Amplify.
