290: Why Airbnb switched from OpenAI to Chinese AI (and what it means for your budget)

Tags: ai, ai and big data, tech trends · Feb 11, 2026

AI isn’t just coming from Silicon Valley anymore.

A growing number of startups — and companies like Airbnb — are turning to Chinese open-source AI models instead of US-based APIs. Not because it’s trendy. Because it’s cheaper, more flexible, and often good enough.

In this episode, Sophia Matveeva speaks with Alex Hern, AI correspondent at The Economist, about what’s driving this shift.

They break down how DeepSeek disrupted the market, why constraints fueled smarter engineering, and what founders can realistically try today if they want more AI options without more spend.

Alex Hern is The Economist’s AI Writer, focusing on the science and technology of artificial intelligence. Before joining the paper, he covered technology for 11 years at The Guardian, where he was the UK technology editor.

In this episode, you will hear:

  • Why relying on US AI APIs may be quietly limiting your product and your margins
  • How Chinese open-source models let founders experiment, customize, and ship faster without runaway costs
  • The real reason DeepSeek shocked Silicon Valley — and what it reveals about building under constraints
  • What you can realistically try today if you want AI leverage without an AI-sized budget

Free AI Mini-Workshop for Non-Technical Founders

Learn how to go from idea to a tested product using AI — in under 30 minutes.

Get free access here: techfornontechies.co/aiclass

Follow and Review:

We’d love for you to follow us if you haven’t yet. Click that purple '+' in the top right corner of your Apple Podcasts app. We’d love it even more if you could drop a review or 5-star rating over on Apple Podcasts. Simply select “Ratings and Reviews” and “Write a Review”, then leave a quick line with your favorite part of the episode. It only takes a second and it helps spread the word about the podcast.


TRANSCRIPT

[00:00:00] Alex Hern: Understanding that the world of AI systems is so much bigger than kind of the offerings from three large American labs. That's, that's the best thing to get into your head. You're not just waiting for a big company to release something you need. There is probably someone who's mucked about with an open source model and made it already.

 

[00:00:27] Sophia Matveeva: Hello and welcome to the Tech for Non-Techies Podcast. I'm your host, Sophia Matveeva. If you're a non-technical founder building a tech product or adding AI to your business, you are in the right place. Each week you'll get practical strategies, step-by-step playbooks and real world case studies to help you launch and scale a tech business without learning to code.

 

[00:00:51] And this is not another startup show full of jargon, venture capital theater or tech-bro bravado. Here we focus on building useful products that make money, without hype and without code. I've written for the Harvard Business Review and lectured at Oxford, London Business School and Chicago Booth, so you are in safe hands.

 

[00:01:11] I've also helped hundreds of founders go from concept to scalable product, and now it's your turn. So let's dive in.

 

[00:01:20] Sophia Matveeva: Hello, smart people. How are you today? Before we get started, I have a quick request. If you've listened to a few episodes of this podcast and you haven't yet left a rating and a review, I'm asking you to do it now.

 

[00:01:35] It really, really helps the show get discovered by other ambitious founders and business leaders like you, and it helps me and my team know that we're creating content that actually serves you. It takes less than a minute, and it makes a huge difference. So thank you in advance. Now, today's episode is something slightly different.

 

[00:01:53] I'm bringing you a conversation with Alex Hern, who covers AI at The Economist, and we're talking about something that could change how you build your tech product, and that is Chinese open source AI models. So here's what's happening. Companies like Airbnb are switching from American AI providers like OpenAI to Chinese models from companies like Alibaba, and that's because basically these models are much cheaper or they're free, they're much more customizable, and in many cases, they're basically almost as good as the frontier models.

 

[00:02:27] So if you're a founder or a business leader, this is important for you because basically it means you have more options and potentially much lower costs when building AI into your product. And in this conversation, Alex and I cover why open source Chinese models are attracting startups and enterprises all around the world.

 

[00:02:46] We cover what DeepSeek's disruption last year taught us about innovation under constraints. We also cover where Chinese AI companies are actually making money. Or are they? And most importantly, what you should actually try today if you want to experiment with these AI models. So if you're building something with AI, or you're thinking about it, this episode will open your eyes to a whole world beyond ChatGPT and Claude.

 

[00:03:13] So here's my conversation with Alex Hern from The Economist.

 

[00:03:20] Alex, I heard that Airbnb has switched from using American AI models like OpenAI to using Alibaba, a Chinese big tech behemoth, for their customer service. So why is that happening, and what does this mean for the broader sector in the US?

 

[00:03:40] Alex Hern: The short reason why it's happening is that Alibaba makes open source models.

 

[00:03:44] The entire Chinese AI sector has been big on open source as its way of competing with the large American labs. If you are trying to buy a product from Anthropic or OpenAI or Google, then what they're trying to sell you is API access to their cloud-hosted AI tools. That's great in a lot of ways, right?

 

[00:04:08] It means you don't need to worry about provisioning it. It means you only pay for what you use. It means you can tinker around to a certain extent. But if you're a big company, or if you have quite specific engineering needs, that might not be enough. Using an open source model for a company like Airbnb has a lot of advantages.

 

[00:04:27] It can be cheaper, because you can download the model and you can run it on whatever hardware you can get your hands on, or you can turn to a cloud provider that's happy to undercut a company like OpenAI or Anthropic. It also means that you can hack around with it, right? If you want to fine tune the AI model to, I don't know, speak in exactly the version of English that you want, then you can do that with an open source model.

 

[00:04:52] API tools from companies like OpenAI offer some stuff like it, and with canny prompting you can do an awful lot, but it's not the same as changing the underlying model. And that's what an open source offering from a company like Alibaba lets you do. It's worth saying that some of the American labs now have their own open source offerings.

 

[00:05:13] I think OpenAI's is called GPT-OSS, and Google has its Gemma open source model. Those aren't the best things the companies make, whereas Alibaba's top tier frontier model, called Qwen, is open source. You can download the weights and run it on whatever hardware you have. That's, that's the appeal to companies like Airbnb.
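As a rough illustration of what "download the weights and run it on your own hardware" looks like in practice, here is a minimal Python sketch using the Hugging Face `transformers` library. The model ID and generation settings are just examples, not anything the episode endorses; the heavy weight download only happens if you actually call `run_local()`.

```python
def build_chat(system: str, user: str):
    # Standard chat-message format accepted by most open-weights chat models.
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

def run_local(messages, model_id="Qwen/Qwen2.5-0.5B-Instruct"):
    # Lazy import: requires `pip install transformers torch`, and downloads
    # the model weights to your own machine on first use.
    from transformers import pipeline
    pipe = pipeline("text-generation", model=model_id)
    return pipe(messages, max_new_tokens=64)

msgs = build_chat("You are a concise assistant.",
                  "Summarise why open weights matter.")
# run_local(msgs)  # uncomment to actually download and run the model
```

Once the weights are on your disk, the same call works offline, on whatever hardware you have, with no per-token bill.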

 

[00:05:35] Sophia Matveeva: So for those listeners who have not listened to our amazing lessons on what an API is and what open source software is, I think the analogy, and tell me, Alex, what you think of this analogy, is that essentially with the American companies like OpenAI, you buy access to their thing. So, let's just be romantic.

 

[00:05:55] They've created this big castle, and inside this big castle there's lots and lots of treasure. And then if you pay them, they open the door to the castle and you can access that treasure, but you can't take it home. You can just kind of look at it and use it. Whereas with the Chinese models, it's open source, which means that actually

 

[00:06:12] You can kind of take the whole castle home and then you can customize it. I dunno, you can turn it into a set of apartments if you want. And that's basically the difference: one basically allows you to plug into itself and therefore doesn't really change that much, whereas the other one, you can import it and change it as you like.

 

[00:06:34] Alex Hern: Exactly.

 

[00:06:34] I mean, an even simpler one that a lot of people might grasp, right, is the difference between watching something streaming from Netflix and having the Blu-ray in your house. The Netflix version is gonna be more convenient. You know, you don't need to worry about where you keep it.

 

[00:06:48] You don't need to have a physical player or the disc drive and all this, but at the same time, you have to pay monthly to Netflix forever, or you lose access to it. And if you are an absolute diehard audiophile, the quality's probably not quite as good as what you could get if you wanted it in your house.

 

[00:07:03] And so for some people in that world, buying physical media is still a thing. You can see downloading and provisioning an open source model on your own hardware as akin to that, right?

 

[00:07:14] Sophia Matveeva: So does this mean that basically now everybody is keen to use the open source models? Because, like, I'm just thinking about this as a business owner myself, and I'm thinking, okay, if I can customize it and it's cheaper, that's just a better choice.

 

[00:07:30] Alex Hern: So there's a good few reasons to stay working with the large American API labs, right? The first is that an API is convenient. In that Netflix analogy, the easiest way of watching TV by far is Netflix. You don't have to stand up. When something ends, it just carries on running. The same is true for these.

 

[00:07:47] The trouble, 

 

[00:07:48] Sophia Matveeva: Alex, 

 

[00:07:48] Alex Hern: Exactly, the same is true for these API offerings, right? They are deliberately made to be the easiest possible thing to get up and running. Now, you can run open source models through APIs as well. There's a bit more of a spectrum here than I'm presenting it as. But if you want to build something with a very powerful AI model very quickly,

 

[00:08:08] Then you can just pay OpenAI and they will give you access. They have customer support, they want your money, and they'll happily take it. The reason for going to an open source system is if you want that flexibility, and also, bluntly, if you don't need the absolute best model in the world available right now.

 

[00:08:27] 'Cause currently those are still from American labs, and they're still kept behind lock and key. You'll have heard people talking about Claude Code recently, right? The reason why they're doing that is because Anthropic's latest model, Claude Opus 4.5, is incredible. It's the most human-like, competent AI system I've ever interacted with, and Anthropic does not let you download it and run it on your own hardware.

 

[00:08:51] If you want to use Opus 4.5, you are speaking to Anthropic and you're paying them for the privilege.

 

[00:08:56] Sophia Matveeva: So interesting. So let's go back to last year, when DeepSeek disrupted everything. It was basically about this time last year. Basically, it came on the market,

 

[00:09:09] It was way cheaper, and there was this massive kind of shakeup in the US market. And, you know, investors were asking, hey, hang on a second, why are we spending so much money on you guys when actually these guys have made it so much cheaper? So how did DeepSeek manage to create something that was

 

[00:09:29] You know, very good, for just so much less money than OpenAI? Is it just because, you know, in Silicon Valley they throw money around?

 

[00:09:39] Alex Hern: That does seem to be a little part of the story, right? As of this time last year, it definitely felt like large AI labs in America were swimming in money, and they were doing fantastic cutting edge research, but that research was about how to use

 

[00:09:55] as much computing power as you can get your hands on to produce the strongest possible model. DeepSeek was doing research on how to use a very constrained number of Nvidia chips to produce a strong model, but one that could be done with the 20,000 or so chips that they could get their hands on. This was even before the real embargoes from America started to bite.

 

[00:10:18] But nonetheless, DeepSeek was supported by a bunch of computing hardware that the hedge fund it's attached to had managed to get its hands on. Wanting it more is a big part of it. But then the sort of less favorable side of the narrative for the Chinese companies is that there's an element of fast follower in what DeepSeek was able to do.

 

[00:10:37] For instance, there is a technique called distillation that you hear about in the AI world sometimes, and what distillation is, is training an AI model on the outputs of another AI model. Say you want an AI system that is able to produce responses like GPT-4.1, which was the pinnacle at the time from OpenAI.

 

[00:10:57] There's a lot of ways you can do that. You can try and mimic the research that you know they're doing. You can try and mimic the data mix that you know they've got in their training data. Or you can ask a million questions of GPT-4.1, save the answers, and feed that into your own system as training data.

 

[00:11:12] And that seems to be a big part of what DeepSeek did. They distilled bigger, more powerful models into something that was accessible, faster to run, and had the efficiency updates that they'd invented in their own labs. And I don't wanna do those efficiency updates down, right? They were coding hooks into these Nvidia chips that no one in America bothered to do.

 

[00:11:36] They were throwing away Nvidia's own programming language 'cause it wasn't powerful enough, and writing what was closer to the metal, right? Sending specific instructions to specific components on these chips, because that was the only way they could get the efficiency they needed out. And it was a phenomenal engineering achievement, but one that was driven out of need.
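The distillation recipe Alex describes, ask a stronger model lots of questions, save its answers, and reuse them as training data, can be sketched in a few lines. This is a toy illustration only: the "teacher" below is a stub function standing in for real API calls to a frontier model, and all the names and questions are made up.

```python
import json

def teacher_answer(question: str) -> str:
    # Stand-in for an API call to a large proprietary "teacher" model.
    canned = {
        "What is 2+2?": "4",
        "Capital of France?": "Paris",
    }
    return canned.get(question, "I don't know.")

def build_distillation_set(questions):
    # Each (prompt, completion) pair becomes supervised training data
    # for a smaller "student" model.
    return [{"prompt": q, "completion": teacher_answer(q)} for q in questions]

dataset = build_distillation_set(["What is 2+2?", "Capital of France?"])
print(json.dumps(dataset, indent=2))
```

At scale, the same loop run over millions of prompts produces a corpus that teaches the student to imitate the teacher's answers without ever seeing its weights.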

 

[00:11:56] Sophia Matveeva: So I think I'm seeing kind of two lessons in here. One is that constraints lead to innovation. I mean, everybody kind of knows that. None of us like constraints, but next time you are constrained and you know you're running out of budget, you can be like, oh well, this is an opportunity to be creative. So remember that.

 

[00:12:15] But another one is that I think it's not necessarily good to be the first mover. So when people talk about first mover advantage, well, is it really an advantage? Because, you know, for example, Apple is a very successful company. It is very rarely a first mover in something. It usually sees trends and technologies develop, and then it kind of looks at things and thinks, well, hang on a second,

 

[00:12:38] we can make this more beautiful, we can make this nicer. And then it comes out with something incredible and expensive. So DeepSeek is not the expensive version, but it has definitely come out with something incredible. And so when you look at what's happening in AI in China in general, is that kind of the trend? That it's okay, we do things open source, we do things cheaper,

 

[00:13:04] Our innovation is really in how do we take, you know, what's being made in the US and make it, you know, maybe 98% as good, but 80% of the price, for example?

 

[00:13:18] Alex Hern: I mean, 80% is doing them down, right? Most of these systems are free to download. They are open source. One of the big questions that we've been looking at at The Economist is how any of these Chinese companies actually pay the bills.

 

[00:13:29] If you're Alibaba, you've got a huge e-commerce wing attached; you are fine. If you are Baidu, you've got a search engine. If you're DeepSeek, if you are Z.ai, you know, one of these smaller startups, there are open questions as to whether you are going to actually be able to monetize this enough to fund further development.

 

[00:13:50] But I do think you're right. One of the questions here is kind of existential for AI labs in the West and in China, which is just: is AI a sector of industry where you have a moat if you're first, or is it one where it's very expensive to have the ideas and do the research, and then everyone else can copy your homework?

 

[00:14:09] The reason why the DeepSeek shock was so big was it suggested that we might be in that second world, a world where it's really expensive to come up with AI breakthroughs because you've gotta try all of these failed approaches, and then when you find the one that works, everyone can just go, oh cool, I'll do that, and I'll be there.

 

[00:14:26] It's certainly the case in the year since that we've not had many examples of Chinese labs taking the absolute frontier. We've also not had none. Some Chinese labs have, particularly in specific use cases. Alibaba, for instance, released an image model that could edit images better than any model from any Western company.

 

[00:14:49] Now, that lead was short-lived. Google's Nano Banana Pro came out shortly after and took the top of the leaderboard again. But the race is real, and I think one could be lulled into a sense of complacency if we thought the narrative was just around fast following. These companies are taking the lead occasionally, infrequently.

 

[00:15:08] They're doing it with open source models, which has its own hindrances in terms of what you can actually offer, and they're doing it anyway. You don't take the lead on a leaderboard by distilling down a better model, because you could only ever get to second place with that approach.

 

[00:15:21] Sophia Matveeva: And so how are these companies getting funded?

 

[00:15:24] So you mentioned the Chinese big tech companies; we understand where Alibaba's money comes from. With DeepSeek, there's a massive hedge fund with lots of cash behind it. What about everybody else? If we're not sure what the business model is, and it's open source, who's paying the bills and why?

 

[00:15:42] Alex Hern: So currently, you know, the funding question is a lot like most of the American labs: they're being funded by investors who are hoping to earn a profit down the line. That's an unsatisfactory answer, 'cause the question is, why are they going to earn a profit? They do have revenue sources, so some of it is the classic offerings from open source.

 

[00:15:58] If you want to build something on Z.ai's GLM-5, a coding AI that's coming out shortly, then you can pay them for a service contract, and their engineers will help you get it up and running. Or you can just use it through their API. Although they offer it for download to run for free, they also let you use their hardware.

 

[00:16:17] And if you're just tooling around, maybe it's worth it. In that narrative, you can see the open source releases as something akin to advertising. Yes, lots of people download and use it, but it doesn't cost the company any money if Airbnb uses their models, and it means that they're known for making good models, and maybe eventually they sell the API access themselves.

 

[00:16:38] I think there's kind of a bigger question here, which is whether the open source is a short-term thing for these businesses. There's one potential future where something like DeepSeek or Alibaba's Qwen becomes a household name, if not in the UK, then maybe in China, maybe in one of the many other nations that these companies are targeting.

 

[00:16:58] And if it becomes a household name, then they get to move to a bit more of a traditional closed model. Perhaps they don't put their best versions up for open source, or they charge quite, you know, substantial fees to download them. Those are all options that I'm sure they're considering. One of the other problems that they've got, all of these companies, is that within China, the software as a service market never really took off.

 

[00:17:22] There's some historical reasons for that, right? Software piracy was massive in China around the era of boxed software sales, which meant that when moving to the cloud, and paying a monthly fee per seat, came up, Chinese enterprises kind of looked at it and thought, well, why should we do that? We're not used to paying for software.

 

[00:17:38] We'll just carry on using the free stuff that we've downloaded anyway. And so now these AI labs are having to effectively teach their customers a whole new way of doing business. That's a real hindrance to earning cash, compared to the American world where everyone knows how to provision a server on AWS, and they can move straight to making profit, or making less of a loss, as it may be.

 

[00:18:03] Sophia Matveeva: So, you know, what we're seeing in the startup world is that startups, who are generally short on funds, are really loving the Chinese models in general; they seem to be the power users of these models. Is that something that you are noticing as well? Kind of that it's,

 

[00:18:22] Okay, if you're really cash constrained, if you're, you know, three engineers building something new, then Chinese open source is really the way to go. You probably wouldn't even really think about the US models unless you are doing something that requires the super cutting edge, which most people don't.

 

[00:18:38] Alex Hern: The other thing to remember here is that we all have computers, right? Which is a glib thing, but it's worth noting, because running a model on your own laptop is as close to free as you get in the computing world. And that's important, because you can't run an OpenAI model on your own laptop. You know, you are paying them, and that's it.

 

[00:19:00] And so that means if you want to experiment, if you want to play around. It might not be much, right, if you're just doing little tests, but those tests can add up. And particularly if you are the sort of company where you don't have a little slush fund for engineers to run experiments, even if that is a couple of hundred pounds a month, then your engineers are gonna turn to something that they can run locally, and they're gonna turn to the Chinese models for that.

 

[00:19:24] I think that's a huge part of it, right? Just the ability to have stuff on your local hardware that you can prod, poke, and see where it gets you.
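The "little tests add up" point lends itself to a back-of-envelope sketch. Every number below, the per-token price, tokens per experiment, experiments per day, is a hypothetical placeholder for illustration, not a published rate from any provider.

```python
# Back-of-envelope: API experiments are billed per token, while a locally
# run open-weights model costs roughly nothing per call once downloaded.
API_PRICE_PER_1K_TOKENS = 0.01   # hypothetical $/1k tokens
TOKENS_PER_EXPERIMENT = 5_000    # hypothetical size of one quick test
EXPERIMENTS_PER_DAY = 40
DAYS_PER_MONTH = 22

def monthly_api_cost() -> float:
    tokens = TOKENS_PER_EXPERIMENT * EXPERIMENTS_PER_DAY * DAYS_PER_MONTH
    return tokens / 1000 * API_PRICE_PER_1K_TOKENS

print(f"API tinkering: ~${monthly_api_cost():.2f}/month; "
      f"local open-weights model: ~$0 per call")
```

With these made-up numbers the tinkering bill lands around $44 a month per engineer, which is exactly the kind of line item that pushes teams without a slush fund toward local models.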

 

[00:19:34] Sophia Matveeva: And I think this is a really interesting kind of platform for innovation, because when you have people who are genuinely curious, and, you know, engineers who just like building things and fiddling around with new software, then generally something interesting, new products,

 

[00:19:52] Interesting things are going to come out of this. This is very, very fertile ground for us to have basically some new products that are made with cheaper ingredients, which is gonna be great for all of us. So, final question for you. Most of our audience is in the US, and then the second market is in the UK.

 

[00:20:09] So what would you tell us? What should we try in terms of Chinese models? You know, you are speaking to people who probably are using ChatGPT and are using Claude, but are not doing, you know, very, very complicated things with them.

 

[00:20:24] Alex Hern: So I think, in terms of the non-techie world right now, a lot of these models are very much targeting engineers.

 

[00:20:32] You can head to the website Hugging Face, which is a French-American company, which hosts lots of these models that you can play around with to a certain extent. What you'll find when you are prodding and pulling at them is that any given AI system is gonna feel much like any other at that first contact with it. But to see the breadth

 

[00:21:00] of offerings available from large Chinese labs, and the downstream impact of being able to hack around with them, somewhere like Hugging Face is a good place to experiment. So you'll see not only Alibaba's top flight Qwen models and its Qwen image models, but you'll also see versions of all of those that have been fine tuned to produce really good anime style visuals for the image models, or to speak other languages, or to run on mobile phones.

 

[00:21:25] Anything that you can do with an AI model, you'll be able to find done to an open source model. And I think it's understanding that the world of AI systems is so much bigger than kind of the offerings from three large American labs. That's the best thing to get into your head. You're not just waiting for a big company to release something you need.

 

[00:21:47] There is probably someone who's mucked about with an open source model and made it already.

 

[00:21:53] Sophia Matveeva: That's a really cool, positive note to end on. Thank you very much, Alex. 

 

[00:21:57] Alex Hern: Thank you for having me.

 

[00:22:02] Sophia Matveeva: That was Alex Hern, who covers AI for The Economist, and I thought that was very interesting. So if you also found this conversation useful, and I hope you did, and I assume you did because you are still here, here's how you can help the show reach other ambitious founders and business leaders like you.

 

[00:22:18] First, hit subscribe so you don't miss next week's episode. Second, give us a five star rating wherever you get your podcasts. It literally takes 10 seconds, and you don't even need to use your whole hand, you can just use your thumb, and it genuinely helps the show get discovered. And if you want to go the extra mile, then leave a review telling us what you loved, or what clicked, or basically just what you like about the show, because it really helps me and my team know that we're creating something that is useful for you.

 

[00:22:46] So thank you very much in advance. And on that note, have a wonderful day, and I shall be back in your delightful ears next week. Ciao.

Sign up to our mailing list!

Be the first to hear about offers, classes and events