Transcript: One thing that strikes me about ownership versus rental, though, is that ownership tends to produce some kind of capital asset, whereas renting does not; it is always just an expense. That is maybe what I'm considering here. If rental can be avoided because ownership is cheaper, then there is an obvious reason to own. And when speaking about local GPT, why pay to run it on someone else's hardware, even if that is extremely cheap, when you can own it? I guess it depends on how ubiquitous it becomes and how the economies of scale work.

It's a strange thing, because even within the landscape of consumer technology, you don't need to own a gaming PC anymore, because Nvidia's server racks have them. So we may rent a gaming computer in a server room rather than have one ourselves, and that may actually be a more efficient allocation of resources in some way. But it also makes me wonder about the pure economics of that situation, where a gaming PC is not a capital asset. I guess it can be used as one, but in the sense of gaming it is not; if you were using it as a Bitcoin miner, it would become a capital asset. For the most part, though, computing technology depreciates, so a lot of it is not a capital asset to begin with.

I'm still curious, though, about the thought experiment of a GPT-4 SoC, or rather an ASIC: a chip that implements GPT and nothing else. It just does GPT things; that's all the chip does. How widespread can that be? How little power can it use? And if it can do both of those things effectively, why do we need to reach out to a cloud service when you can get that chip for $1? $1 is probably too little, realistically; call it $50. But that cost will likely come down quite rapidly.

Anyway, it's a strange thing, ownership versus renting. The main point I'm trying to make is that I would prefer to own things rather than rent them, because ownership gives me leverage in the world. But it is worthwhile to think about what kinds of things actually give you leverage. Obviously, the tech companies that own the IP to these large language models, and the compute resources to run them, are extraordinarily valuable. Given that the compute those companies hold is extremely valuable, how much less valuable do they become when that compute is commoditized? And will it become commoditized? Those are harder questions to answer. My bet is still that it will; it may take a while, but I think it is inevitable. So where does that leave those companies, where does it leave people who bet on the commoditization of that compute, and how do you use that commoditization effectively?

I wonder, in some sense, if the product now is saying: actually, you don't need to buy a subscription. Buy this piece of hardware instead, because it can do everything the subscription does, but you own it.

There are some bigger questions about the storage and computing landscape here that are worthwhile to think about and try to answer. Notably, what I mentioned about the cloud earlier: I'm not deluded into thinking I have all of my files on my local machine. I certainly do not. Most of my files do not exist on my computer; they exist on GitHub and iCloud.
That's pretty much where all of my files reside. And things that used to be files on my machine, say movies and music, are also not there anymore. There is a question to be asked about whether they should be; not "should" in a moral sense, but in the sense of whether it is more efficient, capital-wise, to keep them there. That's a question I'm unsure about, but I believe it is worthwhile to think about. It's interesting: the commoditization of personal computers allowed an entire industry to flourish. The commoditization of large language models is happening now, and eventually they will be commoditized down to the chip level. How will that impact things? These are pretty big questions that I think I need more brains to help answer.
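To make the own-versus-rent arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an assumption for illustration only (the $50 device price echoes the guess above; the subscription price, power draw, and electricity rate are invented), so treat it as a way of framing the question rather than an answer.

```python
# Back-of-the-envelope: when does owning a hypothetical local GPT device
# beat renting the same capability as a subscription?
# All figures below are assumptions for illustration only.

DEVICE_COST_USD = 50.0             # hypothetical one-time price of a GPT ASIC
DEVICE_POWER_WATTS = 5.0           # assumed average power draw
ELECTRICITY_USD_PER_KWH = 0.15     # assumed residential electricity price
SUBSCRIPTION_USD_PER_MONTH = 20.0  # assumed cloud subscription price

HOURS_PER_MONTH = 730


def monthly_cost_of_ownership() -> float:
    """Ongoing cost of running the device continuously (electricity only)."""
    kwh = DEVICE_POWER_WATTS / 1000 * HOURS_PER_MONTH
    return kwh * ELECTRICITY_USD_PER_KWH


def breakeven_months() -> float:
    """Months until the one-time device cost is recovered versus renting."""
    monthly_saving = SUBSCRIPTION_USD_PER_MONTH - monthly_cost_of_ownership()
    return DEVICE_COST_USD / monthly_saving


if __name__ == "__main__":
    print(f"Electricity per month: ${monthly_cost_of_ownership():.2f}")
    print(f"Break-even after {breakeven_months():.1f} months")
```

Under these made-up numbers the device pays for itself in a few months; the interesting variables are how fast the hardware depreciates and how fast cloud prices fall.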
The author is considering the dilemma between renting and buying AI hardware, particularly GPUs, for a company that requires significant compute resources to take off. Renting encourages minimal use of funds, which conflicts with the need for extensive GPU utilization to create something noteworthy. The author suggests that constantly running GPUs at full capacity for inference is a distinctive strategy that could provide a competitive edge by enabling real-time, high-performance applications. This approach implies a constant inference process over one's data, making that data more accessible and valuable for sorting and classifying, a concept the author is still pondering.
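As a rough illustration of what constant inference over one's own data could look like, here is a minimal sketch, assuming an always-on local model behind some callable interface. The `classify_with_local_model` function and the `~/notes` directory are hypothetical placeholders, not a real API or the author's actual setup.

```python
# Sketch: continuously classify local documents with an always-on local model.
# `classify_with_local_model` is a hypothetical placeholder for whatever
# local inference chip, GPU, or server you actually run; swap it out as needed.
from pathlib import Path
import json
import time

LABELS = ["finance", "personal", "work", "reference"]


def classify_with_local_model(text: str) -> str:
    """Placeholder for a call into locally owned model hardware."""
    # A real implementation would prompt the model with `text` and LABELS.
    return LABELS[len(text) % len(LABELS)]  # dummy result so the sketch runs


def classify_directory(root: Path, index_file: Path) -> None:
    """One pass over a directory: label every text file and record the result."""
    if not root.exists():
        return
    index = {}
    for path in root.rglob("*.txt"):
        index[str(path)] = classify_with_local_model(path.read_text(errors="ignore"))
    index_file.write_text(json.dumps(index, indent=2))


if __name__ == "__main__":
    # Run indefinitely: if the hardware is owned, idle cycles are nearly free,
    # so re-indexing continuously adds little beyond electricity.
    while True:
        classify_directory(Path.home() / "notes", Path("classification_index.json"))
        time.sleep(3600)  # re-run hourly
```

The stub simply cycles through labels so the sketch runs; the point is the shape of the loop, where owned and otherwise idle hardware makes continuous re-indexing cheap.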
The tension between ownership and renting, particularly in the context of technology and cloud computing, is a curious and complex topic that the speaker is exploring. They highlight the idea of effectively renting compute time and storage through services like OpenAI and Microsoft Azure, and how reliance on the cloud has become a prevalent aspect of modern computing. The speaker also acknowledges the convenience of storing data in the cloud, even though they own a computer. They mention OpenAI's work in this area, specifically the Assistants API, and express a desire to further explore and challenge their own perspectives on these concepts.
The writer expresses enthusiasm for the potential of recent technological advancements, specifically with regard to enhancing individual engagement and benefit rather than corporate application. They believe in the potential of mobile devices to run large language models, ultimately changing how individuals interact with computers and information. They draw parallels between early computing and the current focus on corporate-oriented technology, expressing a preference for the democratization of such capabilities. The writer feels optimistic about the direction of technology and its potential for widespread value, despite current perceptions.
The cost of computing power is expected to decrease, leading to increased availability. This makes the ability to use that computing power for extensive processing and post-processing very important, especially as hardware architectures evolve. If the hardware supports it, massively parallel inference and the use of large language models for parallel post-processing will likely be both feasible and significant. The trend toward more accessible compute will thus play a pivotal role in advancing post-processing capabilities and the application of large language models.
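A minimal sketch of that kind of parallel post-processing, assuming each record can be transformed independently; `post_process` here is a hypothetical stand-in for whatever model call would actually do the work.

```python
# Sketch: fan a post-processing step out across many workers, on the
# assumption that per-call compute keeps getting cheaper and more available.
# `post_process` is a hypothetical placeholder for a large-language-model call.
from concurrent.futures import ThreadPoolExecutor


def post_process(record: str) -> str:
    """Placeholder for an LLM call that cleans, summarizes, or tags a record."""
    return record.strip().lower()  # dummy transform so the sketch runs


def parallel_post_process(records: list[str], workers: int = 32) -> list[str]:
    """Apply post_process to every record in parallel; output order is preserved."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(post_process, records))


if __name__ == "__main__":
    raw = [f"  Record {i}  " for i in range(1000)]
    print(parallel_post_process(raw)[:3])
```

A thread pool fits when the model call is I/O-bound, for example a request to a local or remote inference server; for in-process, compute-bound inference, batching requests or using a process pool would be the more natural fit.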
The speaker is pondering whether market efficiency could benefit from restructuring, speculating about specialization and the formation of numerous smaller, highly focused companies. They note an anecdote from Chandler about underutilization at work, suggesting that technological advances have outpaced job functions, leading to wasted potential and the possibility that better time utilization could lead to happier employees and less intense work environments. The speaker questions the need for large companies to conduct all operations internally, proposing that concentrated capital expenditures might be more effective, and uses TSMC as an example while also acknowledging the possibility that their ideas might not be valid. Overall, the speaker is curious about the potential for market disruption through a reimagined corporate structure and is seeking to view the market through this new perspective.