'In my lifetime it's one of the biggest transformations I’ve seen'. A Samsung exec talks about Galaxy AI – and why the UI is the AI

Samsung Galaxy Z Fold 6 hands-on (Image credit: Future / Lance Ulanoff)

When I first met Samsung's Patrick Chomet a year ago we talked about the company's new foldables, updated watches, and earbuds. We didn't talk about AI, and the concept of 'Galaxy AI' did not, at least for me, exist then. Now, as we sit talking in Paris at Samsung Galaxy Unpacked 2024, AI weaves through our discussion like a fine yet tensile thread, knitting in new capabilities on-device and in the cloud that will redefine how we use this next, emerging generation of Samsung Galaxy products.

Chomet, who as Executive VP and Head of Customer Experience chooses both product features and third-party features to bring to Galaxy devices, admitted to me that the pace and scale of change is unprecedented. "In my lifetime [this is] one of the biggest transformations I’ve seen. I went through feature-phone to smartphone transformation. I think this one is actually bigger."

Samsung fully entered the generative AI race, alongside partner Google, when it introduced Galaxy AI with the Samsung Galaxy S24 line in January. Now it's bringing many of those features, including conversation translation, photo editing, and image generation, to its foldables (the Galaxy Z Fold 6 and Galaxy Z Flip 6), its new earbuds (the Galaxy Buds 3 and Galaxy Buds 3 Pro), and the brand-new Samsung Galaxy Ring (which Chomet noted was created in part because people, including him, don't like wearing a smartwatch while they sleep).

I went through feature-phone to smartphone transformation. I think this one is actually bigger.

In the interim between these two launches, Apple finally entered the fray with Apple Intelligence, its rebranded suite of ecosystem-level generative AI features. I wondered how Samsung viewed Apple Intelligence. Chomet laughed quietly and told me that his PR team usually cautions him against speaking about competitors, but, without mentioning the company by name, he offered a rather scathing analysis of Apple's big move.

"We launched [the] intelligent S24, and… I don't like the name, but we have to have a name, so we call it Galaxy AI, which will represent all these intelligence functions across the different touch points. And then [a] competitor has a different name. We say we are hybrid, and I think they end up with [something] similar. First of all, ours is real. So we have this stuff deployed on over 100 million devices already and over 200 million by the end of the year. So you can see what it is, and you can try it. I don't know anything else from the market, to be honest, that is real."

Chomet was clearly referring not only to the fact that Apple Intelligence is tied to platforms due for release later this year, like iOS 18, but also to Apple's indication that some of the biggest changes, like on-device image generation, might not arrive until late this year or even next year.

Patrick Chomet in 2023 (Image credit: Future / Lance Ulanoff)

Closer to home, though, Chomet is focused not just on Galaxy AI features on individual products; he sees this as an ecosystem play.

"The transformation to AI has to do with all services becoming intelligent but more deeply understanding customer context and intention, so that’s very profound. We want to deploy that at scale on all devices, including smartphones and others," Chomet told me, adding later that "We want to do it at an ecosystem level."

That ecosystem play is muddied slightly by the existence of Bixby, arguably among the group of early digital assistants, like Siri and Alexa, that are all due for significant upgrades. But Galaxy AI is not Bixby and vice versa. I asked Chomet where Bixby fits into this picture.

The Bixby question

Chomet told me it's important to remember that all of these early assistants, Bixby included, were based on powerful natural language processing (NLP). "Now what is happening with large language [models], with the technology, it makes the NLP quite redundant, and LLM [large language model] technology is much superior in understanding human intent, language, gesture, voice. So that's a big technology change."

Bixby is, for Samsung, a service that's been primarily concerned with device control, and while it may not be as well known as Siri or Alexa, if you enable it on your Galaxy phone you might be surprised at all the things it can do for you. Even so, I wondered if there's still a place for Bixby alongside the more visible and highly-touted Galaxy AI.

"It's a service that is used by a lot of our customers, but not all. And we will actually continue to evolve Bixby and to improve it with this LLM technology and others, to make the kind of experience across Samsung devices magical," Chomet revealed.

Put another way, Bixby is set to get a lot smarter, though no timeline has been set for this. Even so, Bixby will remain fundamentally different, if not separate, from Galaxy AI.

"I'm trying to [make a distinction] between the fabric of the device, which is intelligent, and then we will run all kinds of services on the device, from Bixby to music to other things, which all will have intelligence as well," Chomet explained.

Samsung Galaxy Z Fold 6 Circle to Search (Image credit: Future)

Building the AI ecosystem

That fabric will cut across multiple Galaxy devices, adding an intelligence that will ultimately recognize context and intention, and it's a strategy that could be supercharged by the Galaxy ecosystem. Samsung has been pressing the idea of this ecosystem for years, but because the company does not have full-stack control like Apple (from devices down through the platforms and to the silicon), it's not always apparent in the same way.

"We see a healthy development to the Galaxy ecosystem. We still have a lot to do, that’s fair," Chomet admitted, but he also detailed how owning more Galaxy devices, especially with the addition of Galaxy AI, will offer new benefits. "The keyword is convenience," he said. 

Chomet pointed out how pairing the new Galaxy Buds 3 Pro with a Galaxy Z Fold 6 takes a single click, and that the buds then know to deliver calls and read out emails or notifications from the phone. The combination of devices makes them work as one, he told me: "It's not science fiction, by the way, this is real."

Throughout our conversation, Chomet described a fundamental shift in how Samsung views platforms.

"What we say is the UI is AI. AI is the UI. The user interface becomes intelligent... each and every service, ours or third-party, is powered by AI and includes generative AI. So that will continue. That is kind of a revolution, but not the deepest, from my point of view. The deepest revolution for us is the UX; the user interface being intelligent, starting with gesture, text, voice, audio, and so on. So that's the user interface. The AI challenge is one that we are very, very deeply involved in, and of course, also plays at OS level with our partnerships."

Samsung Galaxy Z Fold 6 sketch to image (Image credit: Future)

I've already started playing with some of the Galaxy AI features on multiple devices, and one thing I noticed is that they're not all easily discoverable. Some, like conversation translation, reside under Settings. Chomet didn't disagree.

"So, my kind of constant battle is that we have so many things and they're like, how would the customer know that? Oh, you have to go into the Settings menu... but nobody goes there, especially not anybody in my family ever uses that menu. So the beauty would be that you don't. The goal is you should never have to go into the Settings menu," he says, before adding cryptically, "which we will never achieve, but that's our goal."

Samsung has made improvements on that front. I noticed, for instance, that one of the Galaxy AI features first introduced with the S24 line, the ability to add slow motion to videos shot at normal speed, has been improved on the Galaxy Z Fold 6, with Samsung making it easier for users to save those clips. Chomet told me that generative AI photo-editing suggestions are another change on that front. "Now we suggest instead of making you dig for them."

Samsung Galaxy Z Fold 6 hands-on (Image credit: Future / Lance Ulanoff)

The special partnership

Where Samsung seems to lack consistency is in how it implements these various Galaxy AI features. Perhaps that's because it relies so heavily on partnerships, especially with Google, with which Chomet says his company has a "very special" partnership. I was curious about how Chomet viewed Circle to Search, a feature that Google appears to claim as its own. Chomet was less certain. "I don’t know who invented it," he told me. "Search is Google, but the physical interface is a joint effort."

The origin story is a bit fuzzier than I expected. Chomet told me that Samsung met with Google a couple of years ago, and was telling Google about new on-screen gestures. "You have to integrate hardware and the software to make it work, right?" he said. "So, I don't know who invented that, but actually, we worked with them, so eventually it's powered by Google search."

As Chomet sees it, sometimes Samsung's needs are the mother of Google's invention. He gestured to the Samsung Galaxy Z Fold 6 sitting in front of us. "We launched the foldable first. So we have something here called screen continuity, which is an Android function. So if you have a YouTube video on your main screen here, and you open the foldable, it will continue automatically, seamlessly. So that function, you will say, 'Well, it's an Android function for sure,' but this function wouldn't exist if I hadn't asked for it, because I need it for foldable."

The deepest revolution for us is the UX, the user interface being intelligent, starting with gesture, text, voice, audio, and so on

Like Apple's, Samsung's AI approach is hybrid, running both on-device and in the cloud. However, services like Circle to Search, and some generative features like Sketch to Image, still happen off-device.

The preference, Chomet told me, is to run things on-device for the low latency and privacy, but there's more to it than that. 

"You don't need connectivity, so it's just faster," he explained. "So performance, if we do it on device, we have more context of the user data on device, which is private. So we can keep all the user context or the user data on-device and personalize the experience like next action, configuration, what should it be, based on your context which is private and secure on device. So the more we keep it on device, the more we can do things which are personalized yet private. So performance, personalization, and privacy […] that's our direction."

In the meantime, though, Chomet told me that users will be able to control whether or not they want any of these generative AI systems to share their data with the cloud.

As the transition continues from hybrid to, maybe someday, all-on-device, Samsung is working to deliver as much of Galaxy AI as possible to its user base of one billion people. 

"The only limitation is quality," he said. "We go as fast as we can go with the right level of quality to deliver to the end user, and it’s actually just beginning."

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor-in-Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular, weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Ryan, the Today Show, Good Morning America, CNBC, CNN, and the BBC.