6 Comments

Maybe the need for human-crafted tools will cease as we teach AIs to code.

If so, Adobe's perfected editing tools may no longer be much of a defensible moat for its creative stack.

So much excitement over toys. Toys aren’t businesses.

May 2, 2023 (edited)

One additional factor that could make AI difficult from an investing standpoint is the tremendous dependencies that arise when companies build on top of foundation models from a small set of providers.

What happens to a start-up's prospects when liberal access to an API becomes more restrictive/costly?

Margins will need to be significant to ensure resiliency.

High-volume, low-margin AI businesses may be at risk.

Hopefully there's enough competition among foundation models to prevent them from pulling the rug on developers.

---

Separately, I'm not taking sides, just FYI: one rebuttal to your argument, Charlie, can be found at the bottom of this article:

https://baincapitalventures.com/insight/how-fintech-can-jump-on-the-generative-ai-bandwagon/

The part that starts here:

"One possible critique of the list above of generative AI applications within financial services is that none of the examples above are revolutionary; rather, each is an opportunity to incrementally improve the process of providing financial services. This is a feature, not a bug!..."

Oooh! I might disagree.

I'm all-in on your new data + AI value proposition with Sleuth. It's our vision. But here are two examples of other use cases:

(1) There are SO MANY failed startups that tried to create no-code or low-code methods of interacting with data. It was always a joke. You can't do much better than SQL (or R or pandas or...), because there is no shortcut.

But this year, that has changed. ChatGPT has enough situational awareness and enough smoothness that it is conceivable for someone to describe how they want datasets manipulated and, with some checking, get their wish. It solves the massive, previously intractable problem of filling in the gaps in verbal instructions.

I can see this coming because I already do it. I've been using ChatGPT to manipulate data-centric code. It understands my instructions and, with some error correcting, produces matplotlib illustrations and, just as readily, actual machine learning solutions that would take me 3x as long without it. The code it writes is sometimes BETTER than the best solutions on StackOverflow.
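
To make that concrete, here's a minimal sketch of the loop I'm describing, assuming the openai Python package. The model name and the example request are placeholders, and the "some checking" step is a human reading the generated code before running it:

```python
# Minimal sketch: translate a plain-English data request into pandas/matplotlib
# code with an LLM, then review it by hand before executing anything.
# Assumes the `openai` package and an OPENAI_API_KEY in the environment;
# the model name and the example request are illustrative, not prescriptive.
from openai import OpenAI

client = OpenAI()

request = (
    "Given a DataFrame `df` with columns date, region, and revenue, "
    "compute monthly revenue per region and plot it with matplotlib."
)

response = client.chat.completions.create(
    model="gpt-4",  # assumption: any capable chat model works here
    messages=[
        {"role": "system", "content": "Reply with Python/pandas code only."},
        {"role": "user", "content": request},
    ],
)

generated_code = response.choices[0].message.content
print(generated_code)  # the checking step: read it before you ever exec() it
```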

This isn't just a better, faster, easier thing, though. Once you can fluidly describe how you want to work with data, you enter a really different realm for working with it. "Easier" translates into genuinely new applications, to the degree that it's like jumping from old-school hand-drawn animation to Pixar. There might be actual business opportunities and disruption there for companies other than the LLM owners you've described, because the UX is core to the solution.

(2) In the next 5 years, it will be SUPER interesting to see what LLMs can do with the largest libraries of open source code. We aren't there yet; it's an extra level of complexity. But we already see potential in many fields of computer-driven design (computer chips, for example) without Generative AI. With computer-driven review and redesign of whole libraries of code, what becomes possible? Code, I think, is easier to understand than extensive prose (fiction or non-fiction) or complex images. Some nuances might be lost, but the highest level of abstraction is lower because code has to follow mechanistic rules.
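
For a sense of what computer-driven review of a whole library might look like mechanically, here's a hedged sketch: walk a repository and ask a model to flag problems file by file. The repo path, model name, and prompt are all assumptions, and a real system would need chunking, cross-file context, and verification:

```python
# Hedged sketch of LLM-driven review over a code library: iterate the repo,
# ask a model to critique each source file. All names here are illustrative;
# real tooling would chunk large files and carry cross-file context.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

def review_file(path: Path) -> str:
    """Ask the model for a short review of one source file."""
    source = path.read_text(errors="ignore")[:8000]  # naive truncation
    response = client.chat.completions.create(
        model="gpt-4",  # assumption
        messages=[
            {"role": "system", "content": (
                "You are a code reviewer. Briefly list bugs, risky "
                "patterns, and possible simplifications."
            )},
            {"role": "user", "content": source},
        ],
    )
    return response.choices[0].message.content

for path in Path("some_library").rglob("*.py"):  # hypothetical repo path
    print(f"--- {path} ---")
    print(review_file(path))
```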

This does very weird and complex things to companies that are built on private solutions for open source libraries, like Red Hat and Mongo. It also creates real problems for companies that use those tools. The potential for hacking when a computer is probing for weaknesses *seems* incredible...
