
1Mby1M Virtual Accelerator AI Investor Forum: With Benjamin Narasin, Founder and General Partner at Tenacity Venture Capital (Part 2)

Posted on Saturday, Nov 9th 2024

Sramana Mitra: Before we move on to the next example, Ben, what kinds of insights are being tracked by this AI chief of staff?

Benjamin Narasin: While it’s designed to be an all-knowing chief of staff across the organization, they’re initially focusing on the engineering stack. It integrates with various systems like GitHub, for example. Imagine it as a virtual assistant that’s always monitoring the department. For instance, it can detect an increase in support tickets for a specific area, notice if a project is slowing down, or if a high-performing engineer or team shows a change in productivity. In a way, it acts like a “hall monitor” for different business units. They’re starting with engineering, but one of the founders I know is using it across his entire company, even though it’s not fully optimized for that yet.

This reminds me of a play-turned-movie, Other People’s Money, which featured a corporate raider with a computer assistant that advised him each morning on business opportunities. He’d ask it, “What do I look at today?” and it would highlight a company, explaining why it was a good buy. That’s essentially what this AI is doing for business insights—observing continuously, so it can act like a proactive assistant across the organization.

Sramana Mitra: It sounds like it’s tracking activity against certain KPIs.

Benjamin Narasin: Well, the advantage of an LLM over traditional systems is that it doesn’t need predefined KPIs. It learns independently by observing and can identify patterns we might not even consider. For example, maybe a feature that was stable starts behaving differently after an update—this system could detect that. This is the power of LLMs; they aren’t restricted by explicit instructions or training data and can find insights on their own.

Sramana Mitra: Let’s move to your next example.

Benjamin Narasin: The second company is one that a founder I had funded out of Tenacity ended up joining after his own business wasn’t able to raise a Series A. It’s called SF Compute, or San Francisco Compute. It started as something like AWS for GPUs, addressing the unique needs of GPU computing versus CPU computing. However, it has evolved into something closer to the CBOE, the Chicago Board Options Exchange, for GPU compute. Essentially, the platform allows users to trade GPU compute power, turning it into a commodity like oil or gold. It enables companies to pre-purchase GPU resources, ensuring they have what they need when they’re ready to scale up. Data centers, too, can sell excess inventory, and the platform makes money whether there’s a shortage or a surplus.

So here’s one of the things that excited me about this. You may remember that I was a freelance food and wine writer for about 10 years. During that time, a movie called Sideways came out, which made Pinot extremely popular in the United States. It was a total surprise, an outlier event, and people didn’t have much Pinot to sell. All the major producers ran around trying to buy up Pinot producers and vineyards, but they couldn’t get enough inventory to meet the demand consumers had created. So they started planting. Well, it takes three years before you can make wine out of grapes you plant. So guess what happened about four or five years later: a massive glut of Pinot. You went from a shortage to a glut, and it changed the market again. Pinot is still very popular.

SF Compute’s platform helps stabilize such fluctuations by balancing supply and demand for GPU compute resources, creating effective pricing and, in essence, a liquid market, which will be critical as AI-powered businesses scale up.

Sramana Mitra: So it functions like an exchange?

Benjamin Narasin: Yes, you can think of it as the NYSE for GPU compute.

Sramana Mitra: Interesting, innovative. What’s your third example?

Benjamin Narasin: The third company, Skillpost, is a marketplace for data for training LLMs. Clients include major content producers like Time Magazine and Harvard Business Review, and on the other side, AI companies like Perplexity. The legal landscape around using proprietary data for AI training is still in flux, with several companies concerned about how their content is being used without compensation. Companies like Condé Nast and the New York Times are suing companies like OpenAI for using their content to train their LLMs. My view is that it’s better to have an economic model than a lawsuit.

Skillpost offers an economic model where publishers can license data to train LLMs, providing them with recurring revenue without losing ownership. Currently, they’re focused on editorial content and have signed on 50 major publishers.

So one of the reasons I’m so excited about this is that during my own journey to having a point of view here, I came to believe that data is only a temporary advantage, only a temporary moat.

In the early days, people will enter into exclusive arrangements. I have a personal investment in a company called Hippocratic that ingests the nursing notes of hospitals and doctors and trains the LLM to act like a nurse. The LLM calls you after you leave the hospital or the doctor’s office to do a follow-up to make sure things are going well. This is awesome, not just because it’s a great business and it’s enormous, but because it could, in theory, see the funds.

Sramana Mitra: And volume.

Benjamin Narasin: The idea, though, is that the data deals they did in the early days were probably exclusive to them, which gave them a head start. But over time, any provider of data is going to realize that the value of their data is far more than they can monetize on their own. If you’re going to make $10 million off of a partnership where you’ve given exclusive rights to your data, that’s great. But if you can sell the rights to use that data to train other people’s LLMs for a million dollars apiece, and you can do that a hundred times, then ultimately you’re going to realize that your data is a fungible and valuable commodity, and you’re going to rent it out to the highest bidders.

So, I believe that data will give people a head start, but I don’t think that over time they’ll be able to protect whatever data they’re accessing, because the economics of data will become too compelling. Look at Reddit as an example. Its public stock market pricing appears to be entirely driven by the fact that they’re getting paid something like $60 million a year to allow their data to be used to train LLMs.

Skillpost’s model allows publishers to rent out data repeatedly to multiple companies, like a teacher instructing different students.

Sramana Mitra: I saw that LinkedIn has started using its data for LLM training, sparking some controversy in the community.

Benjamin Narasin: People get overly concerned. There’s no identifiable personal data involved in these training datasets. It’s like if a teacher uses your business as an example in class; there’s no harm in it. There’s no special data about individuals being preserved by the LLM. It’s a general, aggregate approach that doesn’t warrant much concern.

This segment is part 2 in the series : 1Mby1M Virtual Accelerator AI Investor Forum: With Benjamin Narasin, Founder and General Partner at Tenacity Venture Capital
