Building AI for the Long Term
21 Jan 2026 12:30h - 13:15h
Session at a glance
Summary
This discussion featured a panel on AI infrastructure and its future, moderated by Jessica Lessin with executives from BlackRock, OpenAI, CoreWeave, and G42. The panelists unanimously agreed that AI development is still in its earliest stages, with BlackRock’s Rob Goldstein suggesting “the national anthem is still happening” in terms of progress. The conversation emphasized that physical infrastructure represents the primary bottleneck, with CoreWeave’s Michael Intrator highlighting challenges in securing skilled tradespeople like electricians and plumbers needed for data center construction.
OpenAI’s CFO Sarah Friar described the company’s dramatic transformation from a one-dimensional organization to a multi-faceted platform, noting their revenue growth from $2 billion to over $20 billion ARR. She emphasized that compute capacity remains the binding constraint, with more infrastructure directly translating to faster product development and breakthrough capabilities. G42’s Peng Xiao discussed the UAE’s ambitious five-gigawatt AI campus project, positioning energy cost as eventually equivalent to intelligence cost, while praising China’s “ruthless adoption” of AI technology as a model for other nations.
The panel addressed financing innovations, with various creative structures emerging including equity partnerships, warrant arrangements, and strategic alliances between infrastructure providers and AI companies. When questioned about potential market bubbles, the panelists firmly rejected concerns about circular financing, arguing that demand for AI capabilities far exceeds current supply. They acknowledged that breakthrough efficiency improvements like DeepSeek actually accelerate rather than diminish infrastructure demand, as lower costs enable entirely new use cases and applications that weren’t previously economically viable.
Keypoints
Major Discussion Points:
– AI Infrastructure Buildout and Physical Constraints: The panel extensively discussed the massive physical infrastructure requirements for AI, including data centers, power grids, and skilled labor shortages. Michael Intrator emphasized the “physicality” of the business, noting challenges like training thousands of electricians and the need for concrete, copper, and human resources to support the AI boom.
– Financing and Capital Requirements: The conversation covered innovative financing mechanisms for AI infrastructure, including off-balance sheet deals, strategic partnerships, and new investment vehicles. Rob Goldstein highlighted this as a “generational capital opportunity,” while Sarah Friar discussed OpenAI’s creative approaches including equity rounds, warrant structures, and ecosystem partnerships.
– Global Competition and Adoption Patterns: Peng Xiao raised concerns about Western conversations becoming “self-referential” while China pursues “ruthless adoption” of AI across society. The panel discussed how different regions are approaching AI development, with China leading in usage and adoption despite potentially lagging in fundamental research.
– Cost Reduction and Efficiency Breakthroughs: The discussion addressed dramatic cost reductions in AI (from $33 to $0.09 per million tokens for ChatGPT) and the implications of efficiency breakthroughs like DeepSeek. The panel debated whether such improvements threaten the infrastructure opportunity or actually accelerate demand for compute capacity.
– Business Model Evolution and Market Maturity: Sarah Friar described OpenAI’s transformation from a “one-dimensional” company to a multifaceted platform with diverse products, partnerships, and revenue streams. The panel discussed the evolution from basic chatbots to agentic AI systems and the introduction of advertising models to democratize access.
Overall Purpose:
The discussion aimed to examine the current state and future prospects of AI infrastructure development, covering the technical, financial, and strategic challenges of scaling AI capabilities globally. The panel sought to address both the opportunities and constraints in building the physical and financial infrastructure needed to support the AI revolution.
Overall Tone:
The tone was overwhelmingly optimistic and bullish throughout, with panelists consistently emphasizing the “limitless” potential and transformative nature of AI. The conversation maintained an excited, forward-looking perspective, with speakers building on each other’s enthusiasm. Even when discussing challenges like supply constraints, geopolitical risks, or environmental concerns, the panel framed these as solvable problems rather than fundamental barriers. The tone remained collaborative and reinforcing, with minimal disagreement or skepticism expressed among the participants.
Speakers
– Jessica Lessin – Founder of The Information (moderator)
– Rob Goldstein – Chief Operating Officer of BlackRock
– Peng Xiao – Head of G42 (UAE-based AI company)
– Michael Intrator – CEO of CoreWeave
– Sarah Friar – CFO of OpenAI
– Audience – Audience member asking a question
Additional speakers:
None – all speakers in the transcript were included in the provided speaker names list.
Full session report
AI Infrastructure and Future Development: A Panel Discussion
Executive Summary
This panel discussion, moderated by Jessica Lessin of The Information, brought together senior executives from across the AI infrastructure ecosystem: Rob Goldstein (Chief Operating Officer of BlackRock), Peng Xiao (Head of G42, out of the UAE), Michael Intrator (CEO of CoreWeave), and Sarah Friar (CFO of OpenAI).
The conversation centered on the current state and future prospects of AI infrastructure development. A key theme emerged that AI development remains in its earliest stages, with BlackRock’s Rob Goldstein memorably suggesting that “the national anthem is still happening” in terms of progress. The panelists identified physical infrastructure and compute capacity as the primary constraints on AI scaling, while discussing innovative financing mechanisms and global competitive dynamics shaping the industry.
Key Discussion Points
Early Stage of AI Development
The panelists agreed that despite rapid recent progress, AI development is still in its infancy. Rob Goldstein’s baseball metaphor that “the national anthem is still happening” set the tone for viewing current AI development as pre-game rather than mid-game activity.
Sarah Friar described OpenAI’s evolution from what she called a “one-dimensional” organization focused on a single partnership with Microsoft to a multi-faceted platform. She noted OpenAI’s growth from $2 billion to over $20 billion in annual recurring revenue, while emphasizing they are still “just getting started.” She described the company’s business model evolution as like solving a “Rubik’s Cube” – moving from simple API access to complex partnerships, licensing arrangements, and diverse revenue streams.
Infrastructure as the Primary Constraint
Michael Intrator emphasized the “physicality” of the AI infrastructure business, highlighting that the industry faces traditional construction challenges including shortages of electricians, plumbers, concrete, and copper. He described walking through data centers “like the Death Star” and noted that multiple companies could each consume the world’s entire compute capacity, illustrating the scale of unmet demand.
Sarah Friar identified compute as “the binding constraint and core competitive advantage” for AI development, arguing that additional compute capacity would directly translate to faster product development and earlier breakthrough capabilities.
Peng Xiao provided concrete examples through G42’s infrastructure projects, noting that over 7,000 construction workers and more than 100 cranes are deployed on their AI campus development. He said G42 aims to deliver one billion AI agents in the UAE, which by his calculation would consume “close to one gigawatt of AI infrastructure.”
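The gigawatt figure implies a rough per-agent power budget. The following back-of-envelope sketch uses the aggregate numbers cited on stage; the per-agent breakdown and the 12-hour duty cycle are inferences from Xiao’s remarks, not figures he stated:

```python
# Figures as cited on the panel; the per-agent derivation is illustrative.
agents = 1_000_000_000          # target: one billion AI agents
total_power_w = 1e9             # "close to one gigawatt" while agents run
active_hours_per_day = 12       # agents assumed working 12 hours a day

watts_per_active_agent = total_power_w / agents
daily_energy_gwh = (total_power_w / 1e9) * active_hours_per_day

print(f"{watts_per_active_agent:.1f} W per concurrently active agent")  # 1.0 W
print(f"{daily_energy_gwh:.0f} GWh of energy per day at that duty cycle")  # 12 GWh
```

Even at roughly one watt per active agent, the aggregate load equals a large power plant running half of every day, which is consistent with the panel’s framing of energy as the long-run cost of intelligence.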
Efficiency Improvements Drive More Demand
The panel addressed concerns that efficiency improvements might reduce infrastructure demand. Michael Intrator shared that when DeepSeek demonstrated dramatic efficiency improvements, his clients “picked up the phone, started screaming at me, get me more GPUs now,” illustrating how efficiency breakthroughs accelerate rather than reduce demand by enabling new applications.
Sarah Friar provided specific examples of cost reductions, noting that GPT-4 once cost about $33 per million tokens while GPT-5 mini now costs 9 cents, a reduction of more than 99 percent in a little over two years. Rather than reducing overall demand, these cost reductions have enabled broader access and new use cases.
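The scale of that decline is easy to verify with quick arithmetic. The prices below are the figures quoted on stage; the one-billion-token workload is an illustrative example, not something the panel cited:

```python
# Cost figures as quoted on the panel, in USD per million tokens.
old_cost = 33.00   # GPT-4-era price cited by Sarah Friar
new_cost = 0.09    # current small-model price cited on stage

reduction = 1 - new_cost / old_cost
print(f"Cost reduction: {reduction:.1%}")   # Cost reduction: 99.7%

# What a hypothetical 1-billion-token workload costs at each price point:
blocks_of_1m = 1_000  # 1B tokens = 1,000 blocks of 1M tokens
print(f"Then: ${old_cost * blocks_of_1m:,.0f}")   # Then: $33,000
print(f"Now:  ${new_cost * blocks_of_1m:,.0f}")   # Now:  $90
```

A workload that would have cost tens of thousands of dollars two years ago now costs double digits, which is the economic backdrop for the panel’s claim that cheaper inference expands rather than shrinks demand.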
Financing Innovation and Business Models
Rob Goldstein characterized AI infrastructure as a “generational capital opportunity” requiring creative financial engineering and strategic partnerships across the ecosystem.
Sarah Friar detailed OpenAI’s approach to financing, which includes traditional equity rounds, warrant structures with chip companies, and exploring value-sharing models. She mentioned that OpenAI is developing its own inference chip that “just taped out” and will focus on driving down inference costs.
When Jessica Lessin asked about competitors’ criticism of OpenAI’s move into advertising, Friar outlined specific principles: not sharing conversations with advertisers, not changing model outputs based on ads, and maintaining ad-free options for users who prefer them.
Global Competition and China’s Approach
Peng Xiao raised concerns that Western conversations are becoming “self-referential” while China pursues what he called “ruthless adoption” of AI across society. He argued that China excels at engineering optimization and practical implementation, potentially creating competitive advantages through scale and widespread adoption.
The discussion highlighted tensions between technological leadership and adoption leadership, with different regions pursuing varying strategies to position themselves in the global AI competition.
Energy and Long-term Costs
Peng Xiao shared insights from conversations with Sam Altman, noting that “in the long term, the cost of intelligence will equal the cost of energy, eventually.” This positions energy costs as a fundamental competitive factor in AI development, potentially creating structural advantages for regions with abundant, low-cost energy resources.
Environmental and Community Concerns
An audience member raised concerns about environmental sustainability, specifically regarding heat, water, energy, and carbon consumption as AI scales. The panel acknowledged these concerns but provided limited detailed strategies for addressing them.
Sarah Friar emphasized the importance of community trust and of communicating AI benefits in accessible language rather than technical jargon, though here too the panel offered few concrete strategies for building that trust.
Risk Factors
Michael Intrator identified geopolitical interactions as the primary risk to AI infrastructure development, citing concerns about protectionism, capital cost distortions, and market accessibility issues that could disrupt the global AI ecosystem.
The panel also acknowledged practical challenges including shortages of skilled tradespeople for data center construction and the need for community acceptance of large-scale infrastructure projects.
Conclusion
The discussion revealed broad agreement among industry leaders that AI infrastructure represents a significant investment opportunity still in its early stages. While acknowledging challenges including physical constraints, financing complexity, and geopolitical risks, the panelists maintained an optimistic outlook on continued growth in AI infrastructure demand driven by efficiency improvements that expand rather than contract market opportunities.
The conversation highlighted the evolving nature of AI business models, the critical importance of compute capacity, and the global competitive dynamics that will shape the industry’s development in the coming years.
Session transcript
Good afternoon, everyone. I’m Jessica Lessin, the founder of The Information, and I couldn’t be more excited to be here this afternoon with an all-star panel on AI infrastructure and the future of this important area. So I’ll introduce our panelists and really look forward to the conversation.
We will leave time for questions at the end, so please be saving those. To my left, we have Peng Xiao, who is the head of G42 out of the UAE and pretty much has his hands in every interesting AI project of the moment, so we’re thrilled to have him. Sarah Friar, the CFO of OpenAI, who has been doing business deals with everyone in the ecosystem and, of course, is leading one of the world’s most important AI companies.
Michael Intrator, the CEO of CoreWeave, which also needs no introduction, and Rob Goldstein, the Chief Operating Officer of BlackRock. Rob, I’m going to start with you, because I’m going to mix it up, and please, you are all experts on every topic, so feel free to chime in. Even if a question is not directed at you, Rob, just minutes ago, you were telling me that on the topic of AI and infrastructure, the future is limitless.
Let’s elaborate, and then maybe poke holes in that afterwards, but let’s start with that. What is it, when you stare at 2026, that leaves you with that feeling?
Sure. Well, what is remarkable to me, and I think people very quickly lose sight of, is that three years ago at Davos, the conversation was, you have to go check out this thing called ChatGPT, which was a brand new thing, and everyone was playing with it, like, how do I drive from X to Y?
The most basic, basic, basic things. And I think we forget, people talk about what inning this is, to use a U.S. analogy of baseball.
I think the national anthem is still happening. I don’t even think this has started yet. And if you think about the cycle as being sort of buildout, then adoption, then transformation, we’re still early in the buildout stage.
And if anything, I think there’s so much dialogue right now on bubbles, when in reality, the bigger issue, at least for the next one, two, three years, is rationing capacity. So to us at BlackRock, as the COO, when I say the opportunity is limitless, I just see if you’re someone who’s involved with operating a company, for example, we’re just starting to recognize the transformative capabilities of these technologies.
And I think that when you really think about the company, there isn’t any aspect of the company that can’t be transformed. You know, we were talking earlier about Davos and the nature of Davos and the amount of organization that’s required in terms of the many meetings that you have. Think about in a few years, everything will be automatic.
You’ll know that you bump into someone in the street. You’ll know who they are, the business that you do with them, what you should ask them. So the productivity cycle, I don’t even think has started yet.
And I think it’s limitless in terms of at least my career. The opportunity for transformation will be enough to keep me busy for the next 10, 20 years easily.
So job security.
Easily, at least for me.
So if we talk about, let’s break it down a little bit. We’re talking about infrastructure here. We’ve got to build things.
We’ve got to finance things. We’re seeing this gold rush, but there’s also been talk about, you know, power. How can we support this?
Are data centers going fast enough? Maybe to you, Michael, I mean, what are you, are there some bumps along the way in this moment of scaling? What needs to happen in 2026 to get to the next step?
And then I’ll come to you, Peng, on the same question.
So first of all, thanks for the opportunity to speak. Look, it’s such a fascinating business because the way the funnel works is like everyone sees these unbelievable products that are coming to market that have changed the world and will continue to change the world over the foreseeable future in ways that we haven’t even contemplated yet.
And that is one part of the market. Underneath that, it is physical, right? The defining characteristics of the boundary that exists right now is a physical boundary.
And those physical boundaries occur in a bumpy world, right? And it’s about power, it’s about concrete, it’s about copper, it’s about human resources. And when we think about human resources, you’re talking about the trades.
You need plumbers, you need electricians, they need to be trained. Four years ago, we were building a data center. You had 100 electricians, 80 of them were experts, and 20 of them were beginning their career.
And now you have 2,000 electricians, and 80 of them are experts and everyone else is beginning their career. There’s a physicality to the business right now that the world is struggling to translate into valuation, into public markets, into the path to delivering these products that are generational. And I don’t even mean like one generation; in my mind this goes down as, you know, the wheel, intelligence, sort of like that scale of change that’s occurring. And behind the scenes of that, to go to one of these data centers, to walk into what is functionally the Death Star, you walk through these things and you’re just like, wow, there’s a lot of building, there’s a lot of cables, there’s a lot of optics, there’s an incredible amount of carpenters that are working here.
It’s wild. Right?
Peng, how do you feel about sort of the pace of these projects and are they at the pace that we need to be meeting the demand? And how, and then across border as well, what is the state on that?
Well, let me first say that the top expert here in building is Mike, who just spoke to you on the subject very eloquently. We do have at G42 a branch of our business also focused on building what we call token factories. This is basically a core-and-shell and GPU management business.
To put this in context to answer your question, right now, some of you already know this, Abu Dhabi is undertaking the build out of a five gigawatt AI campus, which was announced last May when President Trump visited Abu Dhabi.
I have the simpler view, again, because I’m not a technical expert in building data centers. My view, which I discussed with Sam multiple times, is that eventually, in the long term, the cost of intelligence will equal the cost of energy. This is a unique advantage and why we, G42 and UAE Abu Dhabi, are drawn into this business: we have a national advantage in energy production.
To have five gigawatt on the grid and no permitting issues to be able to build, it’s a blessing. And this is why we’re working with OpenAI, Microsoft, many others to provide capacity to them. As we speak right now, there are over 7,000 construction workers in the desert with over 100 cranes, building about 250 megawatt per quarter to deliver on this five gigawatt project plan we have.
So we’re obviously very bullish. And we believe UAE will consume a big chunk of that, but we also will be able to export those tokens as packaged intelligence to the rest of the world. So in UAE, to give you perspective, you mentioned that I seem to be doing everything.
Because we are a fairly large company in a fairly small country, we’re serving the entire society using AI. And one of the tasks we have, one of our KPIs this year, is to produce over one billion AI agents to boost our GDP. These agents range across everything from coding agents to petroleum engineers to cybersecurity analysts.
So we did our calculation. If we actually can deliver one billion agents, AI agents, by the end of this year, and they are actually working even just 12 hours, because they can work nonstop, just 12 hours a day, they’ll be consuming probably close to one gigawatt of AI infrastructure.
This is how bullish we are, and why we’re building this infrastructure in UAE.
So Sarah, you need to buy a lot of this capacity and have done so through just a number of different partnerships. I think it’s the one-year anniversary of Stargate. It is, yeah.
Today, which, how’s that project going?
So incredibly, this time last year we announced Stargate to the world, and then very quickly had our president, alongside Masa-san, Sam, Chuck Robbins from Cisco, and Larry Ellison from Oracle, stand up in the Roosevelt Room and talk about the importance of that build.
And what’s wild today, our Oracle campus, for example, we said at the time we’d do upwards of $500 billion and we’re already well over halfway in terms of getting that built up. We’re actually training models in that Oracle campus today on the latest chips. So it’s gone kind of, I think, better than we all dreamed a year ago.
It’s been almost a pinch-me moment. I think more broadly, to kind of pull the threads together: a year or a year and a half ago, OpenAI felt a little bit one-dimensional. We worked with one CSP (technically CoreWeave, under Microsoft), one chip provider, NVIDIA.
We had one product, ChatGPT, and one business model, a subscription for $20. Today, if I think of it as like a corner of a Rubik’s Cube almost, today we have a base of infra that is almost every CSP you can mention. We’re diversifying our chip portfolio, including our own inference chip that just taped out.
On the product side, we’ve gone from a consumer product that was really a chatbot to now be a task worker that can do things like healthcare for you. On the enterprise side, something that can go from a simple ChatGPT wall-to-wall deployment to APIs to now agentic behavior throughout the enterprise, we believe we’re there. And then an API platform.
And of course, Sora, because now we have multimodal. So the product platform is multidimensional. And then finally, the business model is becoming multidimensional, from simple subscription, through SaaS-based pricing, through enterprise licenses, through credit-based pricing, through commerce into advertising.
And then ultimately, I think we can do some real value sharing. For example, in drug discovery, what if we took a license down the line to the drug that is discovered and used that as a way to pay for it? So if I think of that Rubik’s Cube analogy, now you spin it, and you say, okay, I’m gonna take a low-latency chip, like Cerebras, that we announced a few weeks ago, I’m gonna create a high-end, really fast coding SKU, literally best in the world, and I’m gonna have a high-end subscription price for it.
It’s like I’ve turned all the yellows, I’ve created a block. And that brings you back to this conversation about build. We are just getting started, right?
You used baseball. Since I’m kind of an American, but a Brit under the surface, I’ll go to electricity. I’m not gonna do a sports analogy.
But I think we’ve wired the house and turned on the lights, but we have not explained to people that now they could heat the home, they could cook in that home using electricity, they can have entertainment.
And so the capability overhang is massive. Even if models improved zero from today, there is still so much productivity to be had just with what’s in people’s hands.
I think OpenAI has also said you’re building your own data centers. Is that part of the mix in terms of your infrastructure footprint?
Today, I’d say we’re on a journey. We really are utilizing our CSP partners because it’s a way to stay lighter weight on the balance sheet, and frankly, to be able to work with folks, like for example, when we work with G42 and UAE, they have the local expertise.
They know how to do the land, power, shell. When we get inside the data center, as Mike knows, we will tend to bring to bear a very strong point of view on what the kit and the rack and so on looks like sitting in that data center.
We have a whole scaling team whose job is to make sure that frontier models can get trained on that large fabric and therefore what does that fabric need to look like? What is the chipset? What’s the cooling?
What’s the power? How is it just connected? And so that’s a place where we’ve created a lot of, I would say, our own IP.
So we’re on a journey. We made an announcement with SoftBank Energy to do our first colo kind of build-to-suit. So think of that more as the next step.
We’re not all the way to building our own today because, frankly, we have great partners to work with. But, you know, where will we be in three years? I think you said it well: just three years ago, ChatGPT was just a new thing.
So we’re still young in our journey.
But your new chip has taped out already. That’s impressive progress. That’s the heart of the data center.
Yeah, that new chip, you know, it’ll be an inference-specific chip, and it’s all about how do we keep driving down cost on the inference side, right? I find it amazing that from GPT-4, where it cost about $33 per million tokens, today GPT-5 mini costs nine cents. Like, $33 to nine cents is a more than 99% reduction in cost in a little over two years.
That kind of blows my mind back to the point about just getting started because now you’re creating a cost backdrop that actually allows you to give access to everyone.
The other thing that it does is, and it doesn’t do it yet because we are still in a market that’s pinned, right? Like OpenAI can consume the capacity of the world. There’s five other models that can consume the capacity of the world, which gets to part of what you were saying.
But the drop in cost being as precipitous as it is, how many more ideas are going to get an opportunity to come into existence? And that’s what I mean: collectively, everyone on this stage, everyone in this room, I would argue, we don’t know who our clients are yet. Supply and demand don’t work yet, right, because the market is literally pinned to the red line, so it’s always up against the top. But markets are good at this, and they will ultimately bring supply and demand into some sort of balance, which will allow for the creation of tons of new things that we just don’t know about yet. And that’s really exciting. Matter of fact, it’s probably the most exciting part of it for me, because just as people conceived of what exists right now that blows us away, they’ll have an opportunity to do it again in the future.
The demand is still very much in the early discovery stage.
Yeah.
And I think that one of the things that often gets overlooked, going back to the reinforcing nature of creating this intelligence, the number of lines of code in the world is going to increase exponentially.
We were talking about this yesterday and just from a BlackRock perspective, our ability to leverage our engineers to just be exponentially more productive, the net result of that is we need more compute for like the old stuff.
Yeah. So, I think that we are going to have a phase where just the old cloud requirements are going to grow even faster than people expect. In addition to this, it’s very clear that almost all the software in the world, in the next 20 years (you can figure out if it’s in the next two or the next 18), is going to be interchangeable with the AI capabilities that we’re describing. So everything will be inference in terms of how it actually operates. So I think the nature of the technology and the innovation it enables winds up being self-reinforcing in a way that’s only gonna accelerate.
I had the opportunity to interview Andy Jassy yesterday, and he made that AWS argument about the growth in cloud. He called it the middle of the barbell right now. Staying on chips for a second, he also talked about Trainium, which he also believes will be necessary to get costs down, as well as the individual efforts.
I wonder, in the infrastructure stack, how much change do you think we’ll see on the chip side of it? Obviously NVIDIA is the leader, but Michael, maybe to you, I know you’re, I think you’re all NVIDIA maybe at CoreWeave right now, or do you see a world in which you’re working with other chip providers?
So the way that we’ve built our business is we are led by clients and the best solution in market is and has been the NVIDIA technology set. And we can’t keep up with our demand for NVIDIA. It’s hard to allocate resources to, for a company that’s growing as fast as we are, and we are, with the exception of everyone else on this stage, unique, right, in terms of our growth prospects.
And that is said with the recognition that this is an incredible set of companies doing incredible things, going through a growth profile that, at any other point in my career, I would have looked at and it would have, like, caused my head to break.
So. Look, you know, our belief and the way that we have constructed the company and our North Star is take the best computing infrastructure, build the best software solution to be able to orchestrate and deliver it, which will lead to the best product to enable the most companies to be as productive as they can in bringing their products to market.
You know, from where I sit, there’s going to be lots of different alternatives. There’s a lot of different ways to define best, whether it’s cost, whether it’s capacity, whether it’s what you can do with it, flexibility. There’s, you know, like the concept of best is not that simple in this space as we look forward in time and think about the different ways that compute is going to be consumed.
So from my seat and from the business that we are building and the scaling that we are going through, we think that the world will have lots of alternatives, but the best infrastructure out there is going to be built by NVIDIA.
It’s going to be built on a platform that is like ours, if not ours, and it’s going to have a software layer that’s able to deliver it to the market in its most performant configuration. And until the market explains to me otherwise, and markets have a good way of doing that, that’s our plan, that’s our strategy, and we’re going to keep our head down and keep hammering.
So another key component to all of this is capital, of course, and I think last year brought a variety of new flavors of financing deals from off-balance sheet deals to all sorts of new partnerships. I mean, maybe, Rob, from the investor side, are you seeing, I don’t want to call them novel because they’re not novel in the history of finance, but what are you seeing in terms of vehicles for financing data centers that’s catching your eye?
Well, I think this is a generational capital opportunity. where if you think about the numbers we’re talking about, and if you think about, you know, we have this term within BlackRock, the fast river, and you want to put your boat in the fast river to have it help you along the way, and this is one of the fastest rivers I think any of us will experience in our careers.
I think we’re seeing creativity, as there always is in the financial engineering side, but at the same time, I think we’re seeing more and more people looking to create the ecosystem in terms of not only having it as capital partners, but having it as true strategic partners.
And if you go back, for example, to BlackRock and some of the things we’ve done, we’ve actually looked to create investment vehicles that bring together as GPs, people like MGX, so leveraging the incredible innovation of Abu Dhabi, people like Microsoft, people like Nvidia, and bringing that ecosystem together so that way you’re able to prioritize focus in terms of actually bringing these projects to completion.
Sarah, what are you seeing and how do you think about this?
Yeah, so I mean, I think it starts with compute. So compute is the defining characteristic of what we need to supply the demand. It is the binding constraint today.
And I think it is a core competitive advantage because there’s not enough of it. And so if you don’t have it, you get slowed down. And we have faced that a lot in the past year.
People always ask, like, well, you know, if you’d had more compute, what would have happened? I’m like, it’s quite simple. If we had more compute, we’d have more products, we’d have more revenue.
We would literally have had frontier models pulled in by six months, 12 months, 18 months, because we know what we wanna do. And it’s not just revenue. I’m a CFO, I care about our business model.
I’m gonna come back to it in a second. But we’re talking about breakthroughs in areas like healthcare. And if I’m a cancer patient, that breakthrough could actually save my life.
In some ways, speed is even more of the essence right now. So I think we’re holding stuff back that would just help the world writ large. That said, for a CFO sitting in my seat who now has to work through how to pay for that — I mean, nothing beats a good business model.
First and foremost, cash flow is king. And so what we have seen is that as we have invested in compute, our revenue has kind of risen to meet the challenge, right? I often, in these rooms, talk more about ARR because it’s the true load of the business.
It’s the compute I need. It’s the business I need to sustain it. So ARR has gone from $2 billion to $6 billion to over $20 billion just in the few years since ChatGPT burst on the scene.
There’s never been a company like it, right? To Mike’s point, I was a research analyst at Goldman Sachs for over a decade of my career. If someone had showed me that model, I would have said, that’s crazy.
Like, you’re wrong. And we see just this continued momentum as we hit 2026, by the way, it’s not stopping. Our consumer business is hitting daily highs, which is super fun to see.
And our enterprise business is kind of shooting through the roof, right? Jessica’s not wrong. Davos is just an incredible moment to go talk to CEOs and folks who are making these business decisions, right?
I was saying to her as I hit the floor, I was just with the CEO of a large, world-famous bank — if I said the name, you’d know it — who, you know, is feeling a little bit behind. And I had just spent time yesterday with Carlos Torres, who’s the chairman of BBVA.
BBVA wants to be an AI native bank. They’ve gone from 10,000 seats deployed to over 120,000 seats deployed. They’re rethinking their call centers.
They’re rethinking their credit, how they do credit decisioning. They’re in 25 countries, so they have to do multilingual. I mean, totally new way of thinking about enterprise growth.
To finance it, we are trying to be creative. Both, we’ve done equity-based financing. We did the largest equity round ever, $41 billion last year.
We have done unique partnerships. Our AMD warrant structure, I’m really proud of because it’s a great alignment of incentives. If we buy those AMD chips, we could get all the way up to owning almost 11% of the company.
And we think that if we do buy all of those chips, we really help create a lot of market cap for Lisa and team. And they deserve it. They’re an incredible company as well.
We have talked about our partnership that’s brewing with NVIDIA to fund us as we buy gigawatts of their chips as well. And so what we see is the ecosystem kind of rising to work with us — and that includes private equity shops, which have been very, very interested in how they can deploy into their portfolio companies and with that create a business model around it.
So we are just getting started. But I do think it’s important that the ecosystem rise together, otherwise, we’re just going to go slower. And back to where I started, that means that that patient, that child in school or whatever is not going to get access to intelligence.
There’s a habit in my industry — and I’m not saying I believe this, but I’m interested in your reaction to it — of calling these circular financing deals, the sense being that all these deals basically amount to some amount of risk in the system.
Sarah, how do you think about that?
I mean, there’s an implication behind it, though, that the demand is not real somehow. And I think, I mean, we can all just start at the beginning of the panel again. I don’t think we can deny that the demand is here.
And I think in particular, a lot of the folks who are working on Wall Street, or working in your world, Jessica, lived the bubble bursting before, in the internet generation. And when I think about what the internet was like at the time — what was the point of email if there were only, like, three people on the thing, right? You still had to send letters in the mail. I would say to all of you right now, if I went around the room: is ChatGPT not making a difference in your life?
Like, do you not immediately see value in it? And again, like for our frontier users, they’re using effectively, if you look at tokens as a measure of intelligence usage, they’re at 7x what just the average user today is using because they’re coding, they’re doing deep research, they’re in a university lab.
Countries, we’ve looked at it on a country level. The frontier countries are using 3X the amount of tokens already. And so that’s where I think the idea of kind of circularity and so on, it’s that implicit like, oh, the demand’s not there, they’re all just like trying to shore it up, that I completely refute.
And so instead, I just view it as the ecosystem — you know, the folks on this dais right now, right? We all feel like we’re short the thing. And so we can push together, because Abu Dhabi and Peng are the most AGI-pilled of them all, and have the power.
And Mike over here knows exactly how to build a data center and to really think about the tech behind it. And BlackRock is doing incredible things to create financing vehicles, and actually has gotten very deep in data center technology itself, right? Those are all good things because everyone is bringing their expertise to the table.
I’ll add one more thing too, if you don’t mind, maybe zooming out even further. And Robin and I had a discussion about this during lunch, which is I’m a bit worried about many of our conversations here in the West becoming self-referential. I want to point out another country called China.
They may not have the best models in the world, the best computing in the world, but they’re doing what you termed as ruthless adoption. They are driving use cases in every aspect of the society. In fact, they’re a model for many of us to emulate.
They are charging ahead to use AI. And guess what? Being a user, influencing traffic, getting the intelligence out, can circle back to create better models.
Test-time compute, yeah.
Exactly. So study the Chinese usage adoption of AI technology. It’s a lesson for all governments.
Sarah, I want to ask one more follow-up on the business model question you talked about. Some of your competitors here at Davos have raised questions about why you guys are going to advertising so soon. I don’t know if you saw those tweets earlier today, but you’ve talked about your plans too and outlined your principles, but I don’t know if you have a reaction to them saying, oh, it’s a little bit early, they must need the money.
How do you think about the ad opportunity?
So, I mean, first and foremost, like the why, it goes back to my Rubik’s Cube. Remember at the top layer, I’m trying to create as much optionality as possible. What I know today is that 95% of our users for ChatGPT are free users because our mission is AGI for the benefit of humanity, not for the benefit of humanity who can pay.
And so in order to create access, I have to create a really strong business model. Early is a weird word because in ad models, you have to be at scale. Subscale ad models don’t work.
So that would be early. But when you have 800 million weekly active users, you’re already far beyond the scale of many companies who started in that model. But I think we have to be principled.
Number one, we are not gonna change the output of a model based on advertising. We have to make sure users fully trust that you always get the best answer — not the answer that benefits an advertiser who sponsored it, right? Second, we are going to be very careful about not sharing your conversations with advertisers, nor selling your data.
And number three, we always wanna make sure there are ad-free on-ramps, if that is your preference. But for many folks — and I particularly find it at Davos, because you are talking about the globe — what I love about technology is that it is the ultimate democratizer. I’m from Northern Ireland; I already said this once.
If you take a farmer in Northern Ireland, they can literally have the same phone with the same ChatGPT level of intelligence as Bill Gates or Elon Musk, or pick the richest man in the world at the moment — hopefully a woman in the world soon too.
Literally the same. They can’t drive the same car probably. They can’t have the same size of house.
They don’t go on the same fancy vacations, but they can literally have the same technology at their fingertips. But it is our mission to make sure that that can be the case. And so we want to be able to pull all levers along the way.
We know our users are using ChatGPT, not just for things like health, but also for commerce. It’s very natural to come down that funnel because you’ve had a very deep conversation. You might have started by saying, Hey, I’m expecting a new baby.
I really need a new baby stroller. What are the best out there in the market? But here’s my price point.
With memory, ChatGPT actually really understands a lot of this already. It might give you some interesting other ways to think about that purchase. And then what we hear our users saying is, Help me just consummate it.
And that’s where a very interesting conversation with an advertiser can begin. But I think appropriately so because you’re adding value to the end user. And I think that, again, we have to remember our North Stars.
That’s actually why we started by publishing our principles first before we even started testing.
I’m going to go to questions in a moment. But, Peng, you brought up China. And it reminded me of a conversation I had 15 minutes before I walked into this room.
I was talking to an entrepreneur who’s, I don’t want to give too much away, but building an AI company that’s not based in the U.S. And I said, Where do you train your models? And this person says, It doesn’t matter because I can do it so much more efficiently than everyone else.
And I didn’t have time to grill this person on what they were using or how they got it, per se. But it did make me wonder — specifically, has China figured out something around training, from a research side, that fundamentally changes this equation? I know this came up with DeepSeek.
There was a little speculation. And then more broadly, if all of a sudden breakthroughs in AI itself lead to training models much, much more efficiently, is the infrastructure opportunity limitless? Or do we kind of hack that somehow on the model side?
Maybe Michael, maybe Peng.
I’ll comment on the first. I’m not an expert in this area. But I can tell you from my personal point of view, China is very good at engineering.
I think they’re superb at squeezing every bit of engineering advantage out of all of the infrastructure they do have. But I believe they are certainly behind on fundamental research and model breakthroughs. That’s my view right now.
This is why earlier I mentioned where China leads today, in my opinion, is adoption. It’s in fact ruthless adoption, as Rob said earlier. This is where they potentially can gain an edge.
If we’re still debating — can we use it, should we use it, how do we use it — while other nations are moving ahead to adopt it, we will be falling behind, even with the best model in the world.
And in terms of shortcuts to AI infrastructure consumption, I think when you look at China’s usage today, it’s really actually consuming more inferencing capacity. It actually needs more capacity to meet the demand of billions of people. Training, I think they are limited.
But on inferencing, I believe, globally, there will be a lot more demand.
I want to reframe the question. Sure. I think it is important that we all take as a basic truth that there will be order-of-magnitude step-function improvements in efficiency within this technology in the next five years.
Pick your flavor. Like, the DeepSeek step function shocked everybody — except anybody who even remotely touches AI. This technology, the way that we’re using it, what we’re doing — when DeepSeek came out, there are two things that mattered.
It was, what, two years after ChatGPT-3?
Yeah, that’s right.
So like, the world kind of picked its head up and said, oh, AI, I get it, at ChatGPT-3. And two years later, the technology is going through this incredible ramp of maturation. You have to work from the assumption that we will make step functions on the physical infrastructure, we will make step functions on the logical infrastructure, there are going to be unbelievable breakthroughs — and the whole world trembled when that happened.
Certainly the financial world, right? The other thing that mattered is every single one of my clients picked up the phone, started screaming at me, get me more GPUs now. I need more tokens, I need more tokens, get me more GPUs now, right?
And so that’s an indication from where the rubber hits the road, right? Where people are building products, where people are serving inference, where people are monetizing AI. That is telling you that that was an acceleration of the business, not a fundamental change of the business.
The improvements that were exhibited in the technology from DeepSeek were impressive. Full stop. But we also took tokens from $30 down to $0.09, okay?
So there’s a countervailing component of this, which is, okay, hit me with another DeepSeek, let’s take the $0.09 down to $0.0001, and tell me what gets built. The world is going to absorb the tokens. The world is going to consume…
It’s not the infrastructure that’s endless, it’s the voracious appetite for intelligence that is limitless. That’s the part that we have to focus on. And every single step function in the foreseeable future, certainly within the horizons we’re working on…
are going to do nothing more than accelerate the business, right? And that’s the way I look at it. That’s the way I’ve positioned myself to build my company as I look forward in time.
And I think, you know, whether or not people are working through it to the same conclusion, if you’re in this business, you have somehow, someway gotten to that point where you’re saying, yeah, intelligence is a net positive.
It will continue to be a net positive until human extinction.
I mean, could I just, though, really fast on the training? Because I think we’re oversimplifying as well. Remember what we know today — because again, this is only three, four years old — there’s pre-training, which still needs large compute fabrics, lots of data, and incredible algorithms coming from the smartest minds on the planet.
But then post-training is where we got to the reasoning paradigm. And our researchers would tell you post-training — so kind of what we call our O-series of models — is still probably more akin to GPT-3. The GPT model, the large one, is already at five.
But O, which is then when we do the post-training, is still, we have a lot that we can still do there. And then there’s exactly what Peng said, test-time compute. So at the moment where you’re making a call, there actually can be real training done on the inference side, like maybe even down at a device level.
And it’s combinatorial. It’s not like one plus one plus one is three. It’s combinatorial.
And so again, I think there’s a reductionism that the world likes to use. Markets love to use it, right? A single thing.
It’s also complicated to understand, but yes. Totally. But it’s much more rich and diverse than I think the market often gives it credit for.
And that’s why I think you get these wild swing moments, like DeepSeek, or before it was actually the laws of scaling are dead. Then there was the DeepSeek moment. And so market loves these moments.
But I would say if you look back at a stock chart or a market chart, it’s like it looks like the biggest divot of all time. And in reality, what we’ve just seen is this ongoing kind of motion for more.
Catalytic. I think if you zoom out… What you have to believe is that nothing is more positive for economic growth than productivity step functions.
And if you believe, which I strongly do, we strongly do, that AI presents a generational opportunity for productivity step function, whatever accelerates that is positive overall, even if in the short term there’s a little bit of disruption.
So I’m going to take the last question and, oh, sorry, I’m going to ask one more and we’ll do it fast because we have to end on time because some people have some places to go. So what worries you all? This is a very rosy, I mean, it’s a very exciting picture, I think, but what could go wrong or what could throw a bigger speed bump in the way?
What we’re doing exists in a broader backdrop and there are geopolitical, there are, you know, like, there are a lot of things that occur outside of our important but narrow, or at least narrow and expanding, let’s put it that way, line of business.
So what worries me? There are, you know, geopolitical interactions that can distort the market — in terms of protectionism, in terms of cost of capital, in terms of accessibility — they can cause all kinds of wild gyrations. And if I were to put at the top of my list what I think about, in terms of how I manage risk at our company, which I spend an enormous amount of time on, that’s got to be first and foremost.
Okay.
I jumped on that because I knew everybody else was going to say it.
A very quick question because I want to make sure to get to one and we’re going to end right on time.
Okay. So AI is scaling much faster than planetary systems can regenerate. So my question is: what will hit first — regulation, or water, energy and carbon limits?
And who should be responsible for that?
So I’ll start, but the builders of the underlying infrastructure should also definitely chime in. To answer both your questions, I was going to say speed, and the ability to bring people along so that they trust us, right? We live it, we breathe it every single day, but if you are just a normal person living your life out there in the world, this can all feel like tech gobbledygook — one of my favorite words.
And then there’s a lot of trust that still has to be earned. And I do worry that we talk in tech talk, not in real-people speak — like, what does this actually mean for me? If I’m a mom of a diabetic kid, I work hard all day, I come home, I’ve got 30 minutes to cook dinner and it needs to be healthy for my child — how are you going to help me?
And the answer has to be: actually, I’ve got this thing on your phone called ChatGPT. You could take a photograph of your fridge right now, and we’ll give you a recipe you can cook in 30 minutes or less — that is exactly what your kid can have.
Like you’re like, hallelujah, thank you. And so then to come to your question, I think we also have to build trust in communities. Communities who are going to have a data center springing up, is that a good thing or a bad thing?
It’s going to bring me jobs or is it going to ruin my community? Is it going to use my water or are they going to be thoughtful about it? Is it going to raise my electricity prices or are they going to help protect my community?
Because I still have to put bread on the table for my child. And so I think we all have a real onus up here to be bringing along communities and talking in their language — not at them, but with them — taking their feedback and really thinking about how we put that into how we build our businesses, because we can get a little bit into our ivory tower.
I’m going to end it there for time. Thank you all for the fascinating conversation. Good to see you.
Rob Goldstein
Speech speed
146 words per minute
Speech length
774 words
Speech time
317 seconds
AI transformation is still in very early stages, with massive productivity opportunities ahead – limitless potential for next 10-20 years
Explanation
Goldstein argues that AI development is so early that “the national anthem is still happening” and we haven’t even started the game yet. He believes we’re still in the early buildout stage of a cycle that includes buildout, adoption, and transformation, with the productivity cycle not yet begun.
Evidence
Three years ago at Davos, ChatGPT was brand new and people were just experimenting with basic functions like directions. He envisions a future where everything will be automatic – you’ll know who someone is when you meet them and what business you do with them.
Major discussion point
AI Infrastructure Development and Future Potential
Topics
Infrastructure | Economic
Agreed with
– Sarah Friar
Agreed on
AI development is still in very early stages with massive growth potential ahead
Generational capital opportunity requiring creative financial engineering and strategic partnerships bringing together ecosystem players
Explanation
Goldstein views the AI infrastructure buildout as a massive capital opportunity that requires putting resources in “the fast river” of technological change. He emphasizes the need for strategic partnerships that combine capital with operational expertise.
Evidence
BlackRock has created investment vehicles that bring together as GPs entities like MGX (leveraging Abu Dhabi’s innovation), Microsoft, and Nvidia to create an ecosystem that can prioritize and complete projects.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Infrastructure
Infrastructure investment represents putting resources in “the fast river” of technological change
Explanation
Goldstein uses the metaphor of a “fast river” to describe how investors should position themselves to benefit from the rapid pace of AI development. He sees this as one of the fastest-moving opportunities any of them will experience in their careers.
Evidence
BlackRock’s strategy of creating ecosystems with strategic partners rather than just providing capital, focusing on bringing projects to completion.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Infrastructure
Productivity step functions from AI acceleration are fundamentally positive for economic growth despite short-term disruptions
Explanation
Goldstein argues that anything that accelerates AI adoption is ultimately positive for economic growth because AI represents a generational productivity opportunity. Even if there are short-term market disruptions, the long-term economic benefits outweigh the costs.
Evidence
He references the reinforcing nature of AI development, where increased productivity leads to more lines of code and greater compute requirements, creating a self-reinforcing cycle of growth.
Major discussion point
Technology Evolution and Competitive Landscape
Topics
Economic | Development
Michael Intrator
Speech speed
143 words per minute
Speech length
1524 words
Speech time
637 seconds
Physical infrastructure constraints create bottlenecks – need for skilled trades workers, power, concrete, copper in massive scale buildouts
Explanation
Intrator emphasizes that despite the incredible AI products being developed, the fundamental constraint is physical infrastructure requiring traditional construction materials and skilled labor. The scaling challenge involves training massive numbers of electricians and other tradespeople.
Evidence
Four years ago, a data center project had 100 electricians with 80 experts and 20 beginners. Now projects have 2,000 electricians with only 80 experts and the rest being beginners. He describes visiting data centers as like walking into “the Death Star” with incredible amounts of building, cables, and carpenters working.
Major discussion point
AI Infrastructure Development and Future Potential
Topics
Infrastructure | Development
Agreed with
– Peng Xiao
Agreed on
Physical infrastructure constraints are major bottlenecks requiring massive buildouts
Market is supply-constrained with demand exceeding global capacity – multiple companies could consume all available compute
Explanation
Intrator explains that the AI compute market is “pinned to the red line” where supply and demand don’t work normally because demand far exceeds available capacity. Multiple major AI companies could each consume the world’s entire compute capacity.
Evidence
OpenAI alone could consume the world’s compute capacity, and there are five other models that could do the same. The market is literally pinned against maximum capacity constraints.
Major discussion point
Compute Capacity and Demand Dynamics
Topics
Infrastructure | Economic
Agreed with
– Sarah Friar
Agreed on
Demand for AI compute capacity far exceeds current supply constraints
Inevitable order-of-magnitude efficiency improvements coming within 5 years – step functions will accelerate rather than replace the business
Explanation
Intrator argues that major technological breakthroughs like DeepSeek should be expected and will accelerate AI adoption rather than fundamentally change the business model. He believes the appetite for intelligence is limitless and will absorb any efficiency gains.
Evidence
When DeepSeek was released, all of his clients immediately called demanding more GPUs and tokens, indicating that efficiency improvements drive increased demand rather than reduced infrastructure needs. Cost reductions from $30 to $0.09 per token demonstrate the pattern.
Major discussion point
Technology Evolution and Competitive Landscape
Topics
Infrastructure | Economic
Agreed with
– Sarah Friar
Agreed on
Efficiency improvements will accelerate rather than reduce infrastructure demand
Geopolitical interactions pose biggest risks through protectionism, capital cost distortions, and market accessibility issues
Explanation
Intrator identifies geopolitical factors as the primary risk to AI infrastructure development, as they can distort markets through protectionist policies, affect cost of capital, and limit accessibility to resources and markets.
Evidence
He spends enormous amounts of time on risk management at his company, with geopolitical interactions being first and foremost on his list of concerns for how they can distort the market.
Major discussion point
Risks and Challenges
Topics
Legal and regulatory | Economic
Peng Xiao
Speech speed
150 words per minute
Speech length
777 words
Speech time
310 seconds
UAE building 5 gigawatt AI campus with over 7,000 construction workers and 100 cranes, leveraging national energy advantage
Explanation
Peng describes the UAE’s massive AI infrastructure project announced during President Trump’s visit, emphasizing their competitive advantage in energy production. He believes that long-term, the cost of intelligence will equal the cost of energy, making energy-rich nations like UAE naturally advantaged.
Evidence
Currently over 7,000 construction workers in the desert, with over 100 cranes, are building about 250 megawatts per quarter to deliver on the five-gigawatt project. The UAE has five gigawatts on the grid with no permitting issues, which is a significant advantage.
Major discussion point
AI Infrastructure Development and Future Potential
Topics
Infrastructure | Development
Agreed with
– Michael Intrator
Agreed on
Physical infrastructure constraints are major bottlenecks requiring massive buildouts
China leads in “ruthless adoption” of AI across society, creating demand for inference capacity despite model limitations
Explanation
Peng argues that while China may not have the best AI models or computing infrastructure, they excel at widespread adoption of AI technology across all aspects of society. This aggressive adoption creates a feedback loop that can improve models through usage data.
Evidence
China is driving use cases in every aspect of society and charging ahead with AI usage. Their approach of being users and generating traffic can circle back to create better models through the data generated.
Major discussion point
Compute Capacity and Demand Dynamics
Topics
Sociocultural | Development
China excels at engineering optimization but lags in fundamental research – adoption leadership could create competitive advantages
Explanation
Peng believes China is superb at engineering and squeezing every bit of advantage from existing infrastructure, but they lag behind in fundamental research and model breakthroughs. However, their lead in adoption could provide strategic advantages.
Evidence
China’s strength is in engineering optimization and ruthless adoption across society, while they are limited in training capabilities but have high demand for inference capacity to serve billions of users.
Major discussion point
Technology Evolution and Competitive Landscape
Topics
Infrastructure | Development
Disagreed with
– Sarah Friar
Disagreed on
China’s competitive position in AI development
Western conversations becoming too self-referential while missing lessons from global AI adoption patterns
Explanation
Peng warns that Western discussions about AI are becoming insular and self-referential, missing important lessons from how other countries like China are implementing AI technology. He suggests studying Chinese adoption patterns as a model for governments.
Evidence
He specifically mentions that conversations in the West are becoming self-referential and points to China as a model for ruthless adoption that other governments should study and emulate.
Major discussion point
Risks and Challenges
Topics
Sociocultural | Development
Sarah Friar
Speech speed
186 words per minute
Speech length
3193 words
Speech time
1026 seconds
OpenAI has diversified from single partnerships to multi-dimensional infrastructure, products, and business models – still just getting started
Explanation
Friar describes OpenAI’s transformation from a one-dimensional company with single partnerships to a multi-faceted platform with diverse infrastructure, products, and business models. She uses a Rubik’s Cube analogy to show how different components can be combined for various solutions.
Evidence
A year ago OpenAI worked with one CSP (Microsoft), one chip provider (NVIDIA), had one product (ChatGPT), and one business model ($20 subscription). Now they work with almost every CSP, are diversifying chips including their own inference chip, have multiple products from consumer to enterprise to APIs, and multiple business models from subscriptions to enterprise licenses to commerce.
Major discussion point
AI Infrastructure Development and Future Potential
Topics
Infrastructure | Economic
Agreed with
– Rob Goldstein
Agreed on
AI development is still in very early stages with massive growth potential ahead
Compute is the binding constraint and core competitive advantage – more compute would enable faster product development and breakthroughs
Explanation
Friar argues that compute capacity is the primary limiting factor for AI development and represents a core competitive advantage. She states that with more compute, OpenAI would have more products, more revenue, and could have pulled frontier models forward by 6-18 months.
Evidence
She directly states that if OpenAI had more compute, they would have more products and revenue, and frontier models would have been delivered 6-18 months earlier. This impacts not just business but potential breakthroughs in areas like healthcare that could save lives.
Major discussion point
Compute Capacity and Demand Dynamics
Topics
Infrastructure | Economic
Agreed with
– Michael Intrator
Agreed on
Demand for AI compute capacity far exceeds current supply constraints
Cost reductions are dramatic (from $33 for GPT-4 to 9 cents per million tokens for GPT-4o mini) enabling broader access and new use cases
Explanation
Friar highlights the dramatic cost reduction in AI inference over roughly two years, from $33 per million tokens for GPT-4 to 9 cents for GPT-4o mini. This collapse in price enables much broader access to AI capabilities.
Evidence
Specific cost data: the drop from $33 to $0.09 per million tokens is a reduction of over 99% in about two years, creating a cost backdrop that opens access to everyone.
Major discussion point
Compute Capacity and Demand Dynamics
Topics
Economic | Development
Agreed with
– Michael Intrator
Agreed on
Efficiency improvements will accelerate rather than reduce infrastructure demand
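As a quick sanity check on the figures cited above, the per-token price drop can be expressed as a fractional cost reduction; a minimal sketch, using the $33 and $0.09 per-million-token prices as quoted in the session:

```python
# Sanity-check the quoted inference price drop: $33 -> $0.09 per million tokens.
old_price = 33.00  # USD per million tokens (GPT-4, as quoted)
new_price = 0.09   # USD per million tokens (GPT-4o mini, as quoted)

reduction = 1 - new_price / old_price  # fractional cost reduction
print(f"{reduction:.1%}")  # ~99.7%, consistent with the "99%" figure cited on stage
```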
OpenAI exploring diverse financing including equity rounds, warrant structures with chip companies, and value-sharing models
Explanation
Friar describes OpenAI’s creative approach to financing their massive compute needs through various mechanisms including traditional equity, innovative warrant structures with chip companies, and potential value-sharing arrangements in sectors like drug discovery.
Evidence
OpenAI completed the largest equity round ever at $41 billion, created an AMD warrant structure where they could own up to 11% of AMD if they buy enough chips, and are exploring partnerships with NVIDIA to fund gigawatts of chip purchases. They’re also considering value-sharing models like taking licenses on drugs discovered using their AI.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Infrastructure
Demand is real and measurable, not circular financing – users show clear value and usage patterns with frontier users at 7x average consumption
Explanation
Friar refutes suggestions of circular financing by pointing to concrete evidence of real demand, including measurable usage patterns where frontier users consume 7x more tokens than average users, and frontier countries use 3x more tokens than others.
Evidence
ARR has grown from $2 billion to $6 billion to over $20 billion. Frontier users are using 7x what average users consume because they’re coding and doing deep research. Frontier countries are using 3x the amount of tokens. Consumer business is hitting daily highs and enterprise business is growing rapidly.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Development
Disagreed with
– Jessica Lessin
Disagreed on
Risk assessment of circular financing in AI infrastructure
AI training involves complex combination of pre-training, post-training, and test-time compute – more sophisticated than simple reductionist views
Explanation
Friar explains that AI development is much more complex than simple pre-training, involving post-training for reasoning capabilities and test-time compute that can happen at inference or even device level. These components work combinatorially, not additively.
Evidence
OpenAI’s O-series models (post-training reasoning) are still at version 1 while their GPT models are at version 5, showing there’s still significant development potential. Test-time compute can happen at inference or device level, and the effects are combinatorial rather than simply additive.
Major discussion point
Technology Evolution and Competitive Landscape
Topics
Infrastructure | Economic
Disagreed with
– Peng Xiao
Disagreed on
China’s competitive position in AI development
Need to build community trust and communicate benefits in accessible language rather than technical jargon
Explanation
Friar emphasizes the importance of building trust with communities and communicating AI benefits in practical, relatable terms rather than technical language. She stresses the need to address community concerns about data centers and their impact.
Evidence
She gives the example of explaining AI to a mother of a diabetic child in terms of taking a photo of the fridge to get a healthy 30-minute recipe, rather than using technical terms. She also addresses community concerns about data centers bringing jobs versus using water or raising electricity prices.
Major discussion point
Risks and Challenges
Topics
Sociocultural | Development
Disagreed with
– Jessica Lessin
Disagreed on
Timing and appropriateness of OpenAI’s move into advertising revenue models
Audience
Speech speed
101 words per minute
Speech length
33 words
Speech time
19 seconds
Environmental concerns about AI scaling faster than planetary regeneration capabilities regarding heat, water, energy and carbon
Explanation
An audience member raised concerns about the environmental impact of rapidly scaling AI infrastructure, questioning whether the pace of AI development is sustainable given planetary resource constraints and regeneration capabilities.
Major discussion point
Risks and Challenges
Topics
Development | Infrastructure
Jessica Lessin
Speech speed
171 words per minute
Speech length
1097 words
Speech time
383 seconds
Questions whether circular financing deals create systemic risk in AI infrastructure investments
Explanation
Lessin raises concerns from the financial industry about whether the complex financing arrangements in AI infrastructure amount to circular deals that could create risk in the system. She acknowledges this is a common criticism in her industry while seeking reactions from the panelists.
Evidence
She references the habit in her industry of calling these circular financing deals and asks Sarah Friar directly how she thinks about potential risks.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Infrastructure
Disagreed with
– Sarah Friar
Disagreed on
Risk assessment of circular financing in AI infrastructure
Challenges whether breakthroughs in AI efficiency could fundamentally change the infrastructure opportunity
Explanation
Lessin questions whether advances in AI model training efficiency, potentially demonstrated by examples like DeepSeek or other research breakthroughs, could reduce the need for massive infrastructure buildouts. She explores whether the infrastructure opportunity remains limitless if AI becomes much more efficient to train and run.
Evidence
She mentions a conversation with an entrepreneur who claimed they could train models much more efficiently than everyone else, and references speculation around DeepSeek’s efficiency improvements.
Major discussion point
Technology Evolution and Competitive Landscape
Topics
Infrastructure | Economic
Probes competitive concerns about OpenAI’s early move into advertising revenue models
Explanation
Lessin questions OpenAI’s timing in pursuing advertising revenue, referencing criticism from competitors who suggest it may be premature and indicative of financial pressure. She seeks to understand the strategic rationale behind this business model expansion.
Evidence
She specifically mentions tweets from competitors earlier that day questioning why OpenAI is going to advertising so soon and suggesting they must need the money.
Major discussion point
Financing and Business Models for AI Infrastructure
Topics
Economic | Development
Disagreed with
– Sarah Friar
Disagreed on
Timing and appropriateness of OpenAI’s move into advertising revenue models
Seeks to understand what risks could derail the optimistic AI infrastructure growth trajectory
Explanation
Lessin acknowledges the very positive outlook presented by all panelists but probes for potential obstacles or risks that could create significant speed bumps in AI infrastructure development. She wants to balance the rosy picture with a realistic assessment of challenges.
Evidence
She notes that the panel has presented a very rosy and exciting picture but asks what could go wrong or throw bigger speed bumps in the way.
Major discussion point
Risks and Challenges
Topics
Infrastructure | Economic
Agreements
Agreement points
AI development is still in very early stages with massive growth potential ahead
Speakers
– Rob Goldstein
– Sarah Friar
Arguments
AI transformation is still in very early stages, with massive productivity opportunities ahead – limitless potential for next 10-20 years
OpenAI has diversified from single partnerships to multi-dimensional infrastructure, products, and business models – still just getting started
Summary
Both speakers emphasize that despite rapid progress, AI development is still in its infancy with enormous untapped potential. Goldstein uses the baseball metaphor that 'the national anthem is still happening,' while Friar describes massive productivity headroom yet to be realized even with current capabilities.
Topics
Infrastructure | Economic | Development
Demand for AI compute capacity far exceeds current supply constraints
Speakers
– Michael Intrator
– Sarah Friar
Arguments
Market is supply-constrained with demand exceeding global capacity – multiple companies could consume all available compute
Compute is the binding constraint and core competitive advantage – more compute would enable faster product development and breakthroughs
Summary
Both speakers agree that compute capacity is the fundamental bottleneck limiting AI development. Intrator notes that multiple companies could each consume the world’s entire compute capacity, while Friar states that more compute would directly translate to faster product development and breakthrough timelines.
Topics
Infrastructure | Economic
Efficiency improvements will accelerate rather than reduce infrastructure demand
Speakers
– Michael Intrator
– Sarah Friar
Arguments
Inevitable order-of-magnitude efficiency improvements coming within 5 years – step functions will accelerate rather than replace the business
Cost reductions are dramatic (from $33 for GPT-4 to 9 cents per million tokens for GPT-4o mini) enabling broader access and new use cases
Summary
Both speakers argue that technological breakthroughs and cost reductions will increase rather than decrease demand for AI infrastructure. Intrator emphasizes that the ‘voracious appetite for intelligence is limitless’ while Friar shows how 99% cost reductions enable broader access and new applications.
Topics
Infrastructure | Economic | Development
Physical infrastructure constraints are major bottlenecks requiring massive buildouts
Speakers
– Michael Intrator
– Peng Xiao
Arguments
Physical infrastructure constraints create bottlenecks – need for skilled trades workers, power, concrete, copper in massive scale buildouts
UAE building 5 gigawatt AI campus with over 7,000 construction workers and 100 cranes, leveraging national energy advantage
Summary
Both speakers highlight the massive physical infrastructure requirements of AI development. Intrator emphasizes the need for skilled trades workers and traditional construction materials, while Xiao illustrates the point with the UAE's example of 7,000 construction workers building gigawatt-scale capacity.
Topics
Infrastructure | Development
Similar viewpoints
All three speakers share optimism about AI’s transformative economic potential and reject concerns about market bubbles or circular financing. They view technological disruptions as ultimately positive catalysts for growth rather than threats to the infrastructure opportunity.
Speakers
– Rob Goldstein
– Michael Intrator
– Sarah Friar
Arguments
Productivity step functions from AI acceleration are fundamentally positive for economic growth despite short-term disruptions
Inevitable order-of-magnitude efficiency improvements coming within 5 years – step functions will accelerate rather than replace the business
Demand is real and measurable, not circular financing – users show clear value and usage patterns with frontier users at 7x average consumption
Topics
Economic | Infrastructure | Development
Both speakers represent organizations making massive capital commitments to AI infrastructure through innovative financing and partnership structures. They demonstrate practical approaches to scaling AI infrastructure through strategic alliances and creative financial engineering.
Speakers
– Sarah Friar
– Peng Xiao
Arguments
OpenAI exploring diverse financing including equity rounds, warrant structures with chip companies, and value-sharing models
UAE building 5 gigawatt AI campus with over 7,000 construction workers and 100 cranes, leveraging national energy advantage
Topics
Economic | Infrastructure
Both speakers emphasize the importance of widespread AI adoption and community engagement. Xiao highlights China's aggressive adoption model, while Friar stresses the need to build trust and communicate benefits in accessible terms to drive broader adoption.
Speakers
– Peng Xiao
– Sarah Friar
Arguments
China leads in ‘ruthless adoption’ of AI across society, creating demand for inference capacity despite model limitations
Need to build community trust and communicate benefits in accessible language rather than technical jargon
Topics
Sociocultural | Development
Unexpected consensus
Geopolitical risks as primary threat to AI infrastructure development
Speakers
– Michael Intrator
– Peng Xiao
Arguments
Geopolitical interactions pose biggest risks through protectionism, capital cost distortions, and market accessibility issues
Western conversations becoming too self-referential while missing lessons from global AI adoption patterns
Explanation
Despite representing different regions (US-based CoreWeave and UAE-based G42), both speakers converge on geopolitical fragmentation as a major risk. This consensus is unexpected given their different geographic positions and suggests broad recognition that AI development requires global cooperation.
Topics
Legal and regulatory | Economic
Community trust and practical communication as critical success factors
Speakers
– Sarah Friar
– Peng Xiao
Arguments
Need to build community trust and communicate benefits in accessible language rather than technical jargon
China leads in ‘ruthless adoption’ of AI across society, creating demand for inference capacity despite model limitations
Explanation
Unexpectedly, both the OpenAI CFO and UAE AI leader emphasize grassroots adoption and community engagement over pure technological advancement. This suggests recognition that AI success depends as much on social acceptance as technical capability.
Topics
Sociocultural | Development
Overall assessment
Summary
The speakers demonstrate remarkable consensus on fundamental issues: AI development is in early stages with massive growth potential, compute capacity is the primary constraint, efficiency improvements will increase rather than decrease demand, and physical infrastructure buildouts are essential. They also agree on the need for creative financing solutions and community engagement.
Consensus level
Very high consensus, with no significant disagreements on core issues. This strong alignment among industry leaders from different sectors (finance, infrastructure, AI development, and international markets) suggests robust confidence in the AI infrastructure investment thesis and a coordinated industry perspective on scaling challenges and opportunities.
Differences
Different viewpoints
Timing and appropriateness of OpenAI’s move into advertising revenue models
Speakers
– Jessica Lessin
– Sarah Friar
Arguments
Questions whether circular financing deals create systemic risk in AI infrastructure investments
Probes competitive concerns about OpenAI’s early move into advertising revenue models
Need to build community trust and communicate benefits in accessible language rather than technical jargon
Summary
Lessin questions whether OpenAI's move into advertising is premature and suggests it signals financial pressure, while Friar defends it as strategic diversification, undertaken at appropriate scale and with a principled approach
Topics
Economic | Development
Risk assessment of circular financing in AI infrastructure
Speakers
– Jessica Lessin
– Sarah Friar
Arguments
Questions whether circular financing deals create systemic risk in AI infrastructure investments
Demand is real and measurable, not circular financing – users show clear value and usage patterns with frontier users at 7x average consumption
Summary
Lessin raises concerns about systemic risk from complex AI financing arrangements, while Friar argues the demand is demonstrably real and measurable
Topics
Economic | Infrastructure
China’s competitive position in AI development
Speakers
– Peng Xiao
– Sarah Friar
Arguments
China excels at engineering optimization but lags in fundamental research – adoption leadership could create competitive advantages
China leads in ‘ruthless adoption’ of AI across society, creating demand for inference capacity despite model limitations
AI training involves complex combination of pre-training, post-training, and test-time compute – more sophisticated than simple reductionist views
Summary
Xiao emphasizes China's strengths in adoption and engineering while acknowledging its research limitations, whereas Friar focuses on the technical complexity that may favor advanced research capabilities
Topics
Infrastructure | Development
Unexpected differences
Western self-referential thinking versus global perspective
Speakers
– Peng Xiao
– Other panelists
Arguments
Western conversations becoming too self-referential while missing lessons from global AI adoption patterns
Explanation
Xiao's criticism that Western AI discussions are becoming insular was unexpected given the international composition of the panel, suggesting deeper philosophical differences about learning from other development models
Topics
Sociocultural | Development
Environmental sustainability concerns
Speakers
– Audience
– All panelists
Arguments
Environmental concerns about AI scaling faster than planetary regeneration capabilities regarding heat, water, energy and carbon
Explanation
The audience question about environmental sustainability created an unexpected tension with the panelists’ uniformly optimistic growth projections, highlighting a gap between industry enthusiasm and sustainability concerns
Topics
Development | Infrastructure
Overall assessment
Summary
The panel showed remarkable consensus on AI’s transformative potential and infrastructure needs, with disagreements primarily around business model timing, financing risk assessment, and competitive positioning rather than fundamental technology or market direction
Disagreement level
Low to moderate disagreement, with high strategic implications: while the panelists agreed on the scale of the opportunity and the early stage of development, their differing perspectives on financing risks, competitive dynamics, and global adoption patterns could significantly influence investment and policy decisions in this rapidly evolving sector
Partial agreements
Takeaways
Key takeaways
AI transformation is still in very early stages with limitless potential for productivity gains over the next 10-20 years
Physical infrastructure constraints (skilled workers, power, materials) are the primary bottlenecks to AI scaling, not technological limitations
The market is supply-constrained with compute demand far exceeding global capacity – multiple companies could consume all available compute resources
Dramatic cost reductions in AI inference (from $33 to 9 cents per million tokens) are enabling broader access and new use cases
China leads in AI adoption across society despite lagging in fundamental research, creating lessons for Western markets
Creative financing models and strategic partnerships are emerging as necessary tools to fund massive infrastructure buildouts
Order-of-magnitude efficiency improvements are inevitable within 5 years, but will accelerate rather than replace the need for infrastructure
Real user demand is driving the market – not speculative or circular financing – with frontier users consuming 7x average usage
AI development involves complex combinations of pre-training, post-training, and test-time compute that are more sophisticated than simple reductionist views
Resolutions and action items
OpenAI continuing to diversify infrastructure partnerships across multiple cloud service providers and chip manufacturers
UAE proceeding with 5 gigawatt AI campus construction with over 7,000 workers and 100 cranes
G42 targeting production of over 1 billion AI agents by end of year to boost UAE GDP
OpenAI’s first inference-specific chip has taped out and is moving toward production
BlackRock creating investment vehicles that bring together strategic partners like MGX, Microsoft, and Nvidia as GPs
OpenAI publishing advertising principles before testing ad-supported models for free users
Unresolved issues
Environmental sustainability concerns about AI scaling faster than planetary regeneration capabilities regarding heat, water, energy and carbon
How to effectively build community trust and communicate AI benefits in accessible language rather than technical jargon
Geopolitical risks including protectionism, capital cost distortions, and market accessibility issues
Whether Western markets are becoming too self-referential and missing lessons from global AI adoption patterns
Long-term implications of the skilled trades worker shortage for data center construction
Regulatory framework development for AI infrastructure and energy consumption
How to balance rapid AI development with community concerns about local impacts
Suggested compromises
OpenAI’s approach to advertising that maintains model output integrity while creating ad-free options for users who prefer them
Hybrid infrastructure approach combining cloud service provider partnerships with selective build-to-suit facilities rather than full vertical integration
Value-sharing business models like taking licensing stakes in drug discovery outcomes rather than just charging usage fees
Ecosystem collaboration where each player contributes their expertise (energy, construction, financing, technology) rather than competing across all areas
Balancing rapid infrastructure buildout with community engagement and environmental responsibility
Thought provoking comments
I think the national anthem is still happening. I don’t even think this has started yet. And if you think about the cycle as being sort of buildout, then adoption, then transformation, we’re still early in the buildout stage.
Speaker
Rob Goldstein
Reason
This baseball metaphor powerfully reframes the entire AI discussion by suggesting we’re not even at the beginning of the game yet. It challenges the common narrative about AI being mature or potentially overheated, instead positioning current developments as pre-game activities.
Impact
This comment set the optimistic tone for the entire panel and became a recurring reference point. Other panelists built on this theme throughout, with Sarah Friar later using electricity analogies and Michael Intrator referencing generational change on the scale of ‘the wheel, intelligence.’ It shifted the conversation from questioning AI’s sustainability to exploring its unlimited potential.
In the long term, the cost of intelligence will equal the cost of energy, eventually. This is a unique advantage why we are drawn, G42 and UAE Abu Dhabi, drawn into this business because we have a national advantage in energy production.
Speaker
Peng Xiao
Reason
This insight fundamentally reframes AI infrastructure as an energy arbitrage play and introduces geopolitical dimensions to AI competitiveness. It suggests that nations with energy advantages will have structural advantages in the AI economy.
Impact
This comment introduced the geopolitical and resource-based competitive dynamics into the discussion. It helped explain why certain regions are investing heavily in AI infrastructure and set up later discussions about global AI competition, particularly with China.
It’s not the infrastructure that’s endless, it’s the voracious appetite for intelligence that is limitless. That’s the part that we have to focus on. And every single step function in the foreseeable future… are going to do nothing more than accelerate the business.
Speaker
Michael Intrator
Reason
This comment cuts through the technical complexity to identify the core economic driver – insatiable demand for intelligence. It also addresses concerns about efficiency improvements (like DeepSeek) potentially reducing infrastructure needs by arguing they actually increase demand.
Impact
This reframing helped resolve a key tension in the discussion about whether efficiency improvements would reduce infrastructure needs. It reinforced the bullish infrastructure thesis and provided a framework for understanding how technological breakthroughs accelerate rather than cannibalize the business.
I’m a bit worried about many of our conversations here in the West becoming self-referential. I want to point out another country called China. They may not have the best models in the world, the best computing in the world, but they’re doing what you termed as ruthless adoption.
Speaker
Peng Xiao
Reason
This comment challenges Western-centric thinking and introduces a crucial competitive dynamic. It suggests that adoption velocity, not just technological superiority, could determine AI leadership, fundamentally shifting how success in AI should be measured.
Impact
This observation broadened the discussion beyond technical capabilities to include adoption strategies and competitive dynamics. It influenced Sarah Friar’s response about democratizing access and added urgency to the infrastructure buildout discussion by highlighting competitive pressures.
There’s nothing beats a good business model. First and foremost, cash flow is king… ARR has gone from $2 billion to $6 billion to over $20 billion just in the last… There’s never been a company like it.
Speaker
Sarah Friar
Reason
This grounds the entire AI infrastructure discussion in business fundamentals, providing concrete evidence that the demand and revenue growth can support the massive infrastructure investments being discussed. The scale of growth described is unprecedented.
Impact
This comment provided crucial validation for the infrastructure investment thesis by demonstrating that revenue growth is keeping pace with infrastructure needs. It helped address concerns about ‘circular financing’ and established that the demand driving infrastructure investment is real and measurable.
What we’re doing exists in a broader backdrop and there are geopolitical… interactions that can distort the market… And if I were to put on the top of my list what I think about from a how I manage risk at our company… that’s got to be first and foremost.
Speaker
Michael Intrator
Reason
This comment introduces the sobering reality that technological and business fundamentals, while strong, exist within a complex geopolitical environment that could disrupt the entire AI infrastructure buildout through policy, trade restrictions, or international tensions.
Impact
This was the first major cautionary note in an otherwise optimistic discussion, grounding the conversation in real-world risks. It acknowledged that despite strong fundamentals, external factors could significantly impact the AI infrastructure opportunity.
Overall assessment
These key comments shaped the discussion by establishing a framework that moved from unbounded optimism to nuanced realism. Goldstein's 'national anthem' metaphor set an expansive tone that influenced how the other panelists framed their responses. Xiao's insights about energy costs and Chinese adoption introduced crucial competitive and geopolitical dimensions that elevated the conversation beyond a pure technology discussion. Intrator's focus on the insatiable demand for intelligence provided a unifying economic theory, while Friar's business-model validation grounded the theoretical discussion in concrete financial reality. The progression from limitless opportunity to geopolitical risks created a comprehensive view that acknowledged both the transformative potential and the real-world constraints of AI infrastructure development. The interplay between these perspectives produced a rich discussion that avoided both naive optimism and unfounded pessimism.
Follow-up questions
How will the physical constraints of AI infrastructure buildout (power, concrete, copper, skilled trades) be addressed to meet the exponential demand?
Speaker
Michael Intrator
Explanation
Intrator highlighted the critical bottleneck of physical infrastructure requirements, noting the shortage of skilled electricians and the massive physical buildout needed, which represents a fundamental constraint on AI scaling
What will be the long-term impact of China’s ‘ruthless adoption’ strategy on global AI competitiveness?
Speaker
Peng Xiao
Explanation
Xiao warned that Western conversations are becoming self-referential while China is aggressively adopting AI across society, potentially creating competitive advantages through usage data and practical implementation
How will geopolitical factors and protectionism affect AI infrastructure development and market access?
Speaker
Michael Intrator
Explanation
Intrator identified geopolitical interactions as his top risk concern, noting they could distort markets through protectionism, cost of capital changes, and accessibility restrictions
What are the environmental sustainability implications of AI scaling, particularly regarding heat, water, energy, and carbon regulation?
Speaker
Audience member
Explanation
An audience member raised concerns about AI scaling faster than planetary systems can regenerate, questioning which environmental constraint will hit first and who should be responsible for regulation
How can the AI industry build trust with local communities affected by data center development?
Speaker
Sarah Friar
Explanation
Friar emphasized the need to address community concerns about data centers’ impact on jobs, water usage, electricity prices, and overall community welfare to build necessary trust
What specific engineering advantages is China using to maximize efficiency from limited AI infrastructure?
Speaker
Jessica Lessin (prompted by a conversation with an entrepreneur)
Explanation
Lessin raised questions about whether China has discovered fundamental training efficiencies that could change the infrastructure equation, though Xiao suggested it’s more about engineering optimization than breakthrough research
How will order-of-magnitude efficiency improvements in AI technology affect infrastructure demand over the next five years?
Speaker
Michael Intrator
Explanation
Intrator emphasized that step-function improvements in efficiency are inevitable and will accelerate rather than reduce demand for AI infrastructure, requiring strategic planning around this assumption
What will be OpenAI’s strategy for owning versus partnering on data center infrastructure in three years?
Speaker
Sarah Friar
Explanation
Friar indicated OpenAI is on a journey from pure partnerships to potentially building its own data centers, but the timeline and extent of this transition remain unclear
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event

World Economic Forum Annual Meeting 2026 at Davos
19 Jan 2026 08:00h - 23 Jan 2026 18:00h
Davos, Switzerland
