DELIVERED

Hello AI, goodbye UX as we know it with Constantine Gavrykov

Infinum Season 1 Episode 16

In this episode of Delivered, you can learn how to successfully navigate the future of UX and product design as AI agents take on the role of the user.

We sat down with Constantine Gavrykov, a design leader with 20+ years of experience in product and UX design. As a Senior Director of Product Design at Decathlon, he focuses on creating product and user experiences that turn heads, drop jaws, and deliver impactful business results. With previous experience in gaming and e-commerce, Constantine brings a unique perspective on what’s coming next, from AI agents to brain-computer interfaces (BCI).


Key learnings:

  • Find out how AI is transforming UX and product design
  • Learn how to design interfaces that serve both humans and AI agents
  • Explore why personalization-first is replacing the mobile-first paradigm
  • Understand why AI literacy is now a core competency for design leaders 
  • Discover the top three trends that will shape the next decade of UX

Thanks for tuning in! If you enjoyed this episode, hit that follow button so you don’t miss our next delivery.

Have feedback or want to recommend a future guest? Drop us a message!


Delivered newsletter
Delivered episodes are recorded live. Subscribe to our newsletter to stay informed about upcoming live events. When you join a live event, you can ask our guests questions and get instant answers, access exclusive content, and participate in giveaways.

About Infinum
Delivered is brought to you by a leading digital product agency, Infinum. We've been consulting, workshopping, and delivering advanced digital solutions since 2005.

Let's stay connected!
LinkedIn
YouTube

What's up Constantine? 

Hey, all good here. All good. Hi everybody. Thanks a lot for having me. And funnily enough, I'm actually comfortably sitting here in Georgios's studio, ready to have an amazing chat, and looking forward to this. And I have to say kudos to the team. They prepared everything months in advance. We had these discussions and everything was outlined to the minute, and yet we still had, of course, a hiccup at the very last moment.

You can plan for the worst and hope for the best. That's what it's about. I mean, that's what delivering things is about, right? So I think, given my situation, we should get into it and let you talk as much as possible. So for those people that are not yet familiar with you and your work, can you give us an introduction to Constantine: where you started your career, and how you ended up in the position that you are in now?

Sure. Let me try and give you some milestones, because I've been in design for quite a long time already, probably more than 20, 23 years. From the very beginning, somehow I was really interested in computer science, combined with doodling in my notebooks at school, and somehow I knew that I wanted to be a designer. But when the internet properly arrived, that was an eye-opening moment for me, when I said to myself: I want my life to be about it. Another highlight: I'm originally from Ukraine, but I've been living in Amsterdam for the last 19 years. When I moved over, I had a chance to build a team for Tommy Hilfiger to help them work on TommyHilfiger.com. And that was a story for a couple of years. Why I mention this particular one: that was the moment when, as a designer, I also met my passion, which is e-commerce.

And that's an amazing journey, because it always starts way before you go to any dot-coms or apps, and it finishes with a real product in your hands, be it a garment or footwear and so on. I love it so much, and the complexity of it, that I stuck around in the same industry. During the Tommy times, I also helped with rolling out some mobile apps for Calvin Klein, and then I moved to fast fashion, which is Bestseller. And the last couple of years, or a while already, I shifted to sports: I've been with Intersport, Adidas, and right now Decathlon. For those that do not know, it's actually the largest sporting goods retailer in the world.

I know, I just learned that a couple of weeks ago. That's incredible.

But besides all of that, I'm a tech enthusiast, and I try to stay on top of things in terms of emerging trends, spending quite a lot of my time trying to connect the dots: looking at the emerging trends and what is happening in our industry, and translating that into a vision. So, what will the world look like for our brands, but also for ourselves, a couple of years from now, or as far as we can see on the innovation horizon?

Thank you for the intro, Constantine. So the reason I wanted to bring you onto Delivered is that you wrote a series of great articles, "Three Critical Shifts Redefining Digital Industries": the rise of autonomous agents, hyper-personalization and GenUI, and natural user interfaces. It touches on a lot of things that I've been thinking about, and also on what we come across speaking to clients. But I wanted to ask what compelled you to write this, because it must have been quite a lot of work. How did it come about?

Thanks, Anna. I appreciate that you actually found the articles interesting. Frankly, it's been boiling in my head for quite a while. Right now we all talk about AI and the practical application of it, but even half a year ago I was feeling quite lonely in my head. You would have a little bubble where you could talk to the likes of yourself, for instance, but we were still trying out different tools and figuring out: hey, how do we bring this news to the rest of our companies, to our teams as well? What will my future look like?

Something amazing happened to me earlier this year: my son was born, and I was on paternity leave. So for months my hands were full, but my brain was not challenged with my day-to-day stuff at work. And that is exactly when, during all those sleepless nights, these articles came to fruition. I started trying to answer the questions that I had and put it on paper; that also helps me with anxiety. Oh yeah, what will my role look like? What is my team going to be doing a couple of years from now? And this is exactly the result you can see right now in these articles.

Ah, that's very cool. Okay, so that's a very productive use of the sleepless baby nights. Well done. So I want to get into the articles a little bit; we're going to talk about a lot of the topics in there.

And one thing you talk about is mobile-first design, and how that has been the gold standard for a long time. But as we move into this next generation of the web, is mobile-first design still relevant?

Frankly, it's a good question, and I'm contemplating between giving you the mad scientist version of "hey, this and that is going to happen", or a bit more grounded version. Okay, let's go with the mad scientist version. We are right now talking about mobile-first, which implies that we will still have the mobile in our hands, and we'll have the dot-com, so something on the internet, or the apps to go to. But in reality, with the introduction of GenAI and agents, the entire paradigm is shifting. Sooner or later, and we'll have a transition period as well, we'll arrive at a different destination in terms of how we interact with brands, how we interact with information, and how we solve the needs and the pains that we have as customers, or as users, or as just human beings. The shift has already started dramatically with the introduction of LLMs: we're chatting, which is already way more natural than interacting with graphical user interfaces, for instance. But in reality the response is not there yet. So right now we're just making horses faster, and we're trying to figure out how to be there as brands, for instance, when the bots are visiting.

But in reality, the biggest shift from mobile-first to personalization-first is going to happen when two things fall into place. One, the adoption is going to be much higher on the level of using GenAI as personal assistants. And with that adoption, the assistants are going to be trained much more on who you are, Georgios, or who I am, for instance: what I ate yesterday, or when my last vacation trip was, or what kind of training I prefer. With that information, they would be equipped to tailor the experience much more to me. And this is where that hyper-personalization is going to kick in. But the other pillar for this to happen is that the data should be available, and ideally available in a more systematic way than what we have right now.

And by this data, are you referring to the user data, then, that people are sharing? Or what type of data is it that you're referring to?

Here I'm referring to the brand's, the company's data: the data that's available for us to explore. Right now we are consuming it either through APIs, which is not the human way, or through the interfaces. But in reality, there is no need for us to go clicking on different buttons, figuring out which screen to go to and what to do at that particular screen, if I just need an answer to my question, or I need guidance, or I need to even purchase something.

So you talk about personalization, and that this leads to: if users are willing to share some of their personal data, they get hyper-personalization in return, so to speak. What's an example of that? If I were to go and order shoes in a hyper-personalized store, what would that look like?

I would say there will still be tiers, and you don't need to give up everything about yourself if you don't want to. Tier one would be personalization based on, for instance, your actions. AI, the way it works right now, is all about pattern recognition. So how is our experience with the website or the app different? After a couple of clicks you convey a pattern, and AI recognizes it and predicts what step four, step five and so on will be. This is the bottom layer of personalization, and it can try to respond without even knowing who you are. A much higher one would be: you're logged in, we have your purchasing history, for instance, we know something about you, and we already try to adjust the entire flow to you. So for instance, we know that you have been purchasing running shoes for quite a while, as you mentioned, we know that you are an avid runner, and we understand your style and what you are going to be using them for, like a marathon or just a casual run or something like that.

And we can save you a lot of time by just putting the right products in front of your face and also giving you the right information about those products. But hyper-personalization is going to hit when, for instance, you indeed open yourself up to your AI agent and let it shop for you. It would know not only this particular thing; it would know your nutrition plan, your training plan, the next marathon that you want to attend, where it's going to be, and so on. And based on that, it might not even give you the shoe; it might give you an entire package of services. Maybe you have recurring nutrition, for instance, supplements arriving, as well as the training plan, or a community to run with.

Ah, very cool. So here's how I've been thinking about hyper-personalization as a brand experience creator: you might be able to tailor even the content to the specific user. Oh, we know that Constantine likes running on the beach, so it will provide the lookbook information, and all the lookbook content is really centered on shoes on the beach. I'm painting a very literal picture here, but the debate I've been having with other people is: is there a business case there? Do we have any confidence in hyper-personalizing tone of voice, image content, and video content to what we know about the user? Have there been any examples of that, or do you think brands could have any confidence that it would be good for business?

Listen, that's an amazing question, and for me the matter is not even whether there is a business case; it's more about when it is going to be rolled out en masse. This is already happening right now, and it's been happening for a while; it's just that it was not hyper-personalized. Brands would put their customers into cohorts, develop personas, and serve particular content for one persona versus another, for instance. But it was, let's put it this way, mechanical: you identify this cohort, you hope this is exactly them, but you still generalize. And somebody responds better to a video, somebody responds better to a wall of text, and somebody would respond better to chatting with a shopping assistant, live or not. Right now with GenAI, the nature of it is that it can generate stuff on the fly specifically for you. So, to what you're saying: if I respond much better to something happening in my environment, or most likely with the engagement of my friends or something like that, let's use an example.

The models in the fitting room, the virtual try-on problem.

So if you already know my body type and you make sure that all the product information is put on a model with my measurements, I think we could also increase the confidence that I'm likelier to convert.

That's an amazing example. This is exactly where it's also useful. Why? Because size and fit is one of the most notorious impediments that the internet has never been able to solve just yet. The real store experience, for instance, and the feel always beat the virtual one. But we've been trying for quite a while, and with computer vision combined with GenAI, this is exactly where there will be a massive unlock.

Yes, I totally agree. So that's hyper-personalization towards human users, but you also write about the rise of AI agents: shopping agents, users' agents that will go out to visit websites, apps, and e-commerce platforms and do things on their users' behalf, so to speak. You write a lot about the UX and product design implications of that. Can you talk a little bit about that?

To start, I'm pretty sure that everybody already knows what agents are and how they came to be, but not everybody has proper experience of utilizing them just yet. I have this memory of Amy Webb, the futurist, at South by Southwest giving an example of what agents are actually going to be. Current LLMs, large language models, are like this: imagine a genius sitting in a room without windows and with only a door, and you slide the genius your request on a piece of paper under the door. She looks at it, figures out what she can do about it, and slides something back to you. With agents, we're stepping much more into LAMs, large action models, and that is like the genius stepping out of the room. There are no more restrictions: she can perceive the world, see things, take in whatever data inputs are available as her source of information, and act on your behalf, depending on what it is. To give a couple of examples, among the prominent ones: OpenAI with ChatGPT released Operator earlier this year, but that's a computer-using agent. It browses stuff, and it can book you, I don't know, a hotel, fill in a tax form, or go shopping for you, but it's purely focused on the web. What's more exciting...

And it's basically mimicking a human user. It goes out and looks at the page and then clicks around, simulating mouse clicks, or navigates with the keyboard and types things in.

Yeah, it's a really good thing to mention; hence, a computer-using agent. For now it's like we've got faster horses. You don't actually need to do it that way, but because we don't yet have the trust built, and the adoption is going much slower, we need to go through that step of allowing the agent to do it for you while you are still present as an observer. And at the most sensitive moments there is human-in-the-loop design, which gives control back to you: hey, is that the thing you actually wanted to purchase? Or: the payment information, please fill it in. But quite soon, I think, convenience is going to win, and we will quite happily give that control away.
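The human-in-the-loop pattern described here, where the agent runs routine steps on its own but hands control back at sensitive moments like payment, can be sketched roughly as follows. The step names and the "sensitive" set are invented for illustration; real agents classify risk far more carefully:

```python
# Steps where the agent must hand control back to the human (hypothetical).
SENSITIVE_STEPS = {"submit_payment", "share_personal_data"}

def run_agent(plan: list[str], confirm) -> list[tuple[str, str]]:
    """Run each step of a plan; pause at sensitive ones.

    `confirm(step)` is the human-in-the-loop callback: it returns True
    only if the user explicitly approves that step.
    """
    log = []
    for step in plan:
        if step in SENSITIVE_STEPS and not confirm(step):
            log.append((step, "awaiting human"))  # control handed back
            break                                 # stop until approved
        log.append((step, "done"))                # routine step, agent acts alone
    return log

# A shopping run where the user declines the payment step:
plan = ["search_products", "add_to_cart", "submit_payment"]
print(run_agent(plan, confirm=lambda step: False))
# -> [('search_products', 'done'), ('add_to_cart', 'done'), ('submit_payment', 'awaiting human')]
```

The design choice worth noting is the `break`: the agent does not keep acting past an unapproved sensitive step, which is what builds the trust Constantine talks about.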

Yeah, I was watching, I think it was at South by Southwest, the CEO of Signal talking; a super smart woman, I'm blanking on the name. She was saying that this computer-using-agents approach is a privacy nightmare, because they need to look at things and process that information: they look at the phone numbers and addresses of all your friends, credit card information, bank details, bank accounts, then send all that over to Silicon Valley for processing, and then return and do the actions. So my sense is that there will be a pretty big barrier to adoption there because of privacy concerns. But to your point, convenience will eventually win. And there's conversation that OpenAI is in touch with Shopify; if they plug in the APIs directly so they can transfer anonymized information while still acting on user data, I think we're going to see much faster adoption of these things.

And I like that we're still at the moment when the future is not defined yet; it can go in multiple different directions. For me, an interesting happening of the last months was that Visa and Mastercard suddenly showed up on the playground and said: hey guys, by the way, we're going to help out with the agents, we're going to help the agents make payments, for instance. And I was like, hmm, I didn't see that coming. So suddenly they are leading the way on that part, and of course, because they already have a massive user base, they're going to dominate the market. But the other example that I wanted to bring up about large action models is that it's not only computer-using ones. Take Alexa+: this is your Amazon agent that is not only focused on the web, but on all your home appliances that are IoT-enabled. So you have a door camera, you have the microwave, you have the lock, and this kind of stuff. Based on the need, they can open the door, or heat up the pizza, or order and reorder the groceries depending on what you have in your fridge. Tying it into a more interesting topic: with these large action models being able to perceive information, they will eventually penetrate the data even without your consent. To give you an example, last week Meta announced the glasses, right?

The glasses, yeah, the glasses, yes.

This time they're pitching it as: hey, that's going to help you exercise better, be healthier, and in general they're leading with that angle. But what they're asking in return is that the glasses are always on, and you are basically broadcasting everything you see to the AI. Before, with the internet, the AI models were trained on textual data, images, videos and this kind of stuff. Here, Meta is going to be training its own AI on real-world, live data inputs that you willingly let it consume.

Yeah, so it's basically surveillance packaged up as a fashionable accessory.

I mean, if we didn't know what Facebook, or Meta, is, I would be more optimistic. But we've been there; we know what is going to happen: our data is going to be monetized. I don't have any hopes for that part, but it's still a pretty smart move. How else can you acquire a source of so much information at scale?

So you mentioned Alexa+, which makes total sense because that's within Amazon's ecosystem. They already have your consent to all that information, and it feels like a short step for them to just go out and take actions on your behalf. Like: oh, it's been 30 days since the dog food order was made, so I'm just going to go and do that for them. I mean, you already have those subscription-type services, but I think these things will become much more dynamic now.

So it's going to be automated, and this is why I'm talking about convenience. For the first couple of times, for the first maybe couple of months (I don't think it's even going to take years for people to try it out), we will need that AI holding you by the hand and saying: hey, this is exactly what I'm doing. But as soon as it hits mass adoption, we will give that away and just say: hey, I want my cats fed, and I have the automatic feeder, so whenever the food goes below a certain level, reorder automatically. And even that is a detailed, precise request. Quite soon, I guess, we will willingly be less in control, just saying: hey, do what needs to be done.

Yes. So for designing these types of experiences, do you have any best practices or advice for people that are about to design interfaces and systems that should work equally well for humans and for AIs?

Listen, it's a really big topic, and frankly, I will share a couple of thoughts (we've got plenty of time), but on the other hand, nobody's an expert yet. And this is what I love about the time we live in: we're just emerging into it, and through trial and error we're going to get there. But there are some fundamentals. While the agent is there and the human is still there, the main advice is to analyze at which touchpoints exactly you're going to hit one or the other, and to design for dual interactions. Meaning: we still need to sustain a good user experience for the real human being clicking through, tapping through, scrolling through. But we also need to prepare ourselves for the fact that part of our traffic might actually be agents.

So what does designing for dual interactions mean? An agent doesn't see pages or apps or interfaces like humans do. It actually reads and relies on the DOM structure, for instance, and therefore what is emerging is agent-responsive design. But once again, you can make a decision: is now the time for us to engage in this approach, or is it better to wait until we completely eliminate the need for interfaces? Another part is human-in-the-loop design, which I mentioned already. This is where, at the most sensitive moments, we give control back to the human. It's about building trust, having oversight, and making sure, for instance, that you are completely in the know about what was done. Last but not least: avoiding the AI pitfalls. Agents don't see the interfaces, but they still need to decide when and how to engage, and which particular product to show you.

They will be making choices towards the products and services that work faster, that deliver much more transparent information, that give them complete and full control. To give you an example: if you have infinite scroll, you're going to be deprioritized. It's bad if you have modals or pop-ups; those stand in the way of the AI agent giving the result back to me as a customer. So instead of jumping through the hoops that real users jump through right now (and even they abandon such websites), AI agents simply won't do it. We really need to make sure that we build products with that in mind. And eventually... yeah, go ahead.
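One concrete form agent-responsive design already takes on the web today is machine-readable structured data published alongside the visual page, for example schema.org Product markup as JSON-LD, which an agent can parse without rendering or clicking anything. A minimal sketch; the product and its values are invented for illustration:

```python
import json

# A product described once, then emitted as schema.org JSON-LD: the kind of
# structured data an agent can read straight out of a page without scrolling
# or tapping. All values here are made up.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

jsonld = json.dumps(product, indent=2)

# The agent's side of the exchange: parse the markup and answer
# "is it in stock, and for how much?" with zero UI interaction.
parsed = json.loads(jsonld)
offer = parsed["offers"]
print(offer["price"], offer["priceCurrency"], offer["availability"].endswith("InStock"))
# -> 89.99 EUR True
```

This is the "dual interaction" idea in miniature: the human gets the rendered page, while the agent gets the same facts as data, with no infinite scroll or modal in its way.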

I can see you now, but I'm going to join from the other room. So I was curious about what you said about infinite scroll. Why would the agent not pick that up? Or is the alternative that we go back to good old-fashioned pagination?

Frankly, this is about page performance, for instance, and about interaction models that are easier and simpler for AI to consume. Infinite scroll doesn't bring them anywhere; they can indeed, as you say, scan paginated pages much better. But in reality, that's yet again a step in between. The next step is that the AI agent doesn't go to the website for you at all. The AI agent just taps into the raw data that we prepared for it, through whatever means you want, and gets the right information: without clicking, without tapping, without going between pages and so on, because it doesn't need to. Graphical user interfaces were created for us to interact with computer systems, so they're actually going to become a thing of the past. And this is what's happening already right now, with brands figuring out how they can be there and what they can provide much faster. A good example of that: I don't know if you've visited Google recently. It still gives you the pages, right?

Yeah. 

But it gives you the answer in the summary right there at the very top. 

Yeah, I've used it for a long time and I think since they rolled it out, I'm noticing that I'm probably reduced my click-through rate to pages with 80 90%. And I think mean there, I saw a news headline about that that was publications like Huffington to Post and Business Insider has already seen 40 to 60% reduction in organic traffic because they're, since Google rolled out the AI overviews and also of course that ChatGPT now can browse the web and handle those things. And I think that is something that many companies are grappling on and I think we also have that we get clients coming to us and asking us like, Hey, why is our traffic down? 

And 

It's not something that's, it's about to happen. It's happening right now. So I think you're working for a large company that I'm sure it relied to a significant part to some 

Organic 

Traffic. How do you guys think about that? 

So I'm not in a position to share the exact development, but let me share something more overarching. The main two, three things that we already landed on is, one, we need to have AI literacy in place and therefore we need to upskill our people and ourselves to be able to make decisions when we're presented with the right questions, for instance. And the question is like, Hey, how is it going to affect our tomorrow? How is it going to affect the customer shopping habits and behavior? How is that going to affect logistics and value chain for a second? This kind of stuff. But until we get a better understanding of what we're dealing with, it's very hard to make the calls. So the focus here goes on literacy and basically leveling up in terms of the understanding what we're facing. The second part is having the vision in place. 

And this is a bit more difficult, because the main pitfall everybody went for is: hey, here's AI, let's grab it and apply it to whatever problem I have over here. While in reality what we are trying to do, especially from a design perspective, is to look at: what particular things are we facing over here? What customer pains, customer gains, or maybe internal impediments do we have? Which are the highest priority? Which will have the highest impact? And can that be solved with the application of AI, but only as a secondary measure? I think it is quite important not to fall for this "quick, let's automate, let's become more efficient." For instance, there was this meme that a company would spend four months and 25 people automating something that actually takes two hours to do per quarter. So beware of that.

So let's see. We have a bunch of questions, and I'm going to also see if we have some from the audience. But before that, I wanted to talk a little bit about vision, which you mentioned. I think it's important for business leaders right now. I was in Paris this morning and gave a talk in front of a group of business leaders about which leadership characteristics will endure in the age of generative AI, or the age of AI agents, or AI-whatever, fill in the blank. You've been talking about this, so maybe we start there. What's the role of design leaders in this AI era, and which leadership skills do you think are important?

I'm kind of struggling to put it back into the box as just design. It feels that right now we're on the verge of new roles and new possibilities becoming available to us. And one of those things is how we change our perspective: what yesterday was the prerogative of, for instance, a design leader is today suddenly available and enabled for engineering or product and so on. So if you don't mind, I'll go for something more generic. This is exactly where we have a lot of conversations, and this week I came across a really nice playbook by BCG on how corporations can...

Boston Consulting Group, you mean?

Exactly, yeah. How they can prepare for the GenAI era. A couple of points really stuck with me. One is: make AI a business-led agenda. From that perspective, once again, don't just apply the tool wherever; focus on tangible business outcomes, and prioritize value over experiments or pilots and so on. What's also quite interesting to see is the shift: don't centralize everything. Let people, business units, different domains and so on experiment and lead, while of course your IT and your AI team can scale the foundations and bring in the governance. The other advice, and I think this is absolutely a must as a leader: give people time and space to embrace AI in their daily work.

Yeah, over here at Your Majesty we actually started giving people the time and the freedom, one day a week, to experiment with AI tools. Basically: use 20% of your time to become 40, 50, 60, a hundred percent more efficient or productive in an unexpected way. So they get the allocated time for it, and also a budget to buy and try out new tools, so that they can really sit down and think about this. Because you and I are in this field, and I'm sure you feel as overwhelmed as me at times; it feels like just when I figure something out (oh, now I know how to prompt), there's a completely new way. It's moving really fast. So I think it's important, if you want this to be adapted and adopted in your organization, that leaders create space for that.

Talking about that: as leaders, we're quite often presented with not the most comfortable questions. And what you're touching on here is how we as leaders anticipate the impact on our workforce, on people, not only on business and companies: how the roles will change or shift, what is still going to exist, what is going to change, and therefore how we can upskill accordingly. What I notice right now is that a lot of people are trying to put it away, put it aside: hey, it's just going to happen naturally, or I'll tackle it later. But in reality, I would advise stepping into it already and mapping which functions, which parts of your jobs, are going to be automated and which ones are going to be redefined. Having that understanding, and not shying away from it, is probably going to be much more appreciated by the people you lead than just trying to protect them from AI.

Oh, so you mean that you also give them some ownership of defining which parts of their job that would be? Or maybe I misunderstood?

Yeah, no, it's a good correction. I don't think any leader has the ability to define it themselves. Of course you need experts, and this is exactly where people looking at what they do and analyzing it would help map it, would help analyze what's actually going to change. But a first step needs to happen before that: they need to have a good base. They need to go through, hey, let's start with prompt engineering, but let's also understand what LLMs are, how they work, what a single model is, how we work with agents, how we work with multi-agent systems, what the product is going to be. Just to give them the base to then reflect: hey, what am I doing? Do I need to optimize this workflow, or is there maybe a completely different way to do it?

So we're going to move on a little bit. Man, it feels like we could talk about this forever, but we have a script, so I'm going to steer us back. You also write about tech trends, and agentic AI and hyper-personalization are part of it, but which other tech trends do you see defining the future of UX, and what challenges currently stand in their way?

That's an interesting one as well. Without wearing the hat of the mad scientist, it's still going to be

Difficult to discuss? Ah man, you're free to be as mad as you want.

Let's go for it then. Yeah: biotech, personalized medicine, quantum computing, robotics. Robotics is actually not that mad, and I think it's going to hit us way faster. In reality, I bet, and I hope, that it's going to happen much faster. Then in general, hybrid experiences. We talked plenty before about AR and VR, but right now it's actually going to happen, and the biggest leap I anticipate is going to come from BCI, brain-computer interfaces, which are going to completely revolutionize the way we interact with systems, and with the world as well.

So, I mean, Elon says that we're right around the corner. Do you believe that? What hurdles stand in between?

That's a good one. Elon says a lot of things, and he needs to make sure that Neuralink is solved. But are we there? Not yet. Are there pilots and different kinds of experiments? Yes, for sure. The main thing about Neuralink is that it's intrusive, it's invasive. You need to have basically a device attached

Implanted in your brain.

A receiver and then a transmitter, that whole story. Plenty of developments of that kind are happening, but when it comes to the human body, there's a much higher threshold for you to go for it. It's basically an implant. And how do you know that the one you install today is actually going to serve you until the end of your days? So ideally, the breakthrough will happen when we have non-invasive brain-computer interfaces.

So 

Devices like headsets or something like that, which is much more human-friendly. And

I mean, it's not that far-fetched to see that that's how we might end up. There are already companies using pattern recognition and AI models to decode animal speech, for instance. And if you can decode animal speech, you could probably decode brainwaves soon. So having some sensor that measures brain activity and being able to translate that into different actions, or hands-free input and output, does not feel very far-fetched. And I would much rather wear a cap that's hooked up to my BCI than have holes drilled in my head.

Agreed there. Yeah, though I would say if there were proper implants available, I would go for it. A robotic hand or an eye or something like that, for sure. I would be the first one to adopt. Yeah.

I don't know. Yeah, maybe. I would be curious, for sure. So, the production team says that we have a bunch of viewer questions. That's awesome, thanks for sending in your questions, everyone. Let's go into those. The first question is from Katie. She writes: can we anticipate a future where branded experiences, branded UI, eventually become obsolete, if these machines don't need to see a brand's visual identity to execute functions like this for people, with web experiences becoming more like content databases for AI agents? That's an awesome question. Thank you, Katie.

Yeah, amazing question. I wish I had an answer, but we can chat about it. And Georgios, please jump in as well.

So I think I have a take on this. Quick anecdote: our dryer broke down a while ago, and my wife went out to look for a new one. It was the first time she used ChatGPT to do the research, finding the type of dryer she wanted; she wanted one below a specific noise level. So basically ChatGPT went out, served up ten, compared and contrasted them on price and those kinds of things, and then eventually served up the buy-now link. The only thing she did was go to the PDP page of the website that was selling it and make the purchase. Super exciting. It was like she got a personalized shopping assistant that talked her through the whole experience.

There was no branding from a bol.com or a Decathlon or whatever it could be in between. It was an unbiased product recommendation. And as someone who designs brand experiences, yeah, it feels like I got sidestepped, because, oh, they didn't see my beautiful landing page, they didn't go through my super slick PDP with all these transitions and those kinds of things. And I think a significant part of traffic will go that way. But I think the opportunity for people in branding, expression, and UI will come through other things: how you tell brand stories outside the website, how the brand comes to life, how you show up for the first time, how you make a good first impression wherever the viewer might encounter you, whether on social media, on TV, or on a billboard in the real world.

Beyond billboards and real-world branding, I think we're going to see more brand activations in physical spaces that capture someone's attention in the real world. And that's where you start building an experience. Then it's going to be important for all the digital touchpoints to speak in that language. Using Decathlon as an example: if Decathlon were a person, how would it speak? What would it look like? How would it move? How would it greet visitors to the store? It's almost becoming the era of branded agents, which is like an amplification of the brand ambassador. George Clooney is the brand ambassador for Nespresso, but what is the branded agent of Nespresso? Is that George Clooney's voice? Do they buy his likeness and his voice and his mannerisms? I think that's going to be a wild and exciting new opportunity for a lot of companies. The next question doesn't say who it's from: how does AI impact accessibility for people, particularly in e-commerce and online purchases, for example for users with impaired vision? Are there relevant use cases, or do you think we'll have to stick with the classic accessibility-plugin approach? An issue might be how much AI for accessibility costs.

Frankly, this one is actually much easier, because even in its current state, AI with interfaces that we can chat to or speak to is already a massive leap toward accessibility. Why? Because a graphical user interface requires, first of all, an understanding, and second, a means of interacting with it, whether that's a trackpad or a hand, plus the ability to see the interface and so on, while GenAI basically eliminates that, because you interact with it in a much more natural way. You can type, which is of course an acquired skill, but pretty much every human being knows how to speak. So you can do that, or you can speak to it. And with large action models and computer vision, and this is my favorite part, which is not yet fully enabled in Europe, but it's coming:

I was trying it two weeks ago: you can just show it things. Already right now you can upload photos to ChatGPT, but in the US and elsewhere you can just point at something and say, hey, what is this, or tell me how it works, or tell me what's wrong with it, and so on. So from that perspective, it's a massive enabler, and it will be tailored to different needs, whether you have a vision impairment or some other kind of disability. There will be ways to overcome those barriers, and BCI is going to be the next leap after that.

Yeah, no, I think you're absolutely right. And I already use style transfer for things like large reports I don't want to read. I went through the McKinsey AI report and was like, oh my goodness, it's 50 pages, I'm not going to read through this. So I put it in NotebookLM, it generates a podcast episode, and I write: okay, go through all of this information, serve me the bits relevant for businesses like mine. And it transforms that into a conversation with lovely hosts talking it through. It's easy to draw the parallel to the accessibility world, because it will be able to speak to the user in the way that the user best takes in information. I think the last question we'll have time for comes from Mihaela: do you think there will be analog design and AI design, with options for users to choose which level of AI they're comfortable with? Difficult question.

Yeah. Let me unpack it from the last sentiment: the choice. As designers, as leaders, as engineers, and as entrepreneurs, this is on us. It's about ethics and about how we'll shape this world. And I would say yes, we should absolutely focus on that and not give it away. It doesn't matter if it's the interface or the next big thing that's already arriving, deepfakes. It's on us to demand that regulations, for instance, are put in place, and on us not to fall for it ourselves, so that we can clearly see what is AI-generated, what is AI-tweaked, versus the original content. Is it going to be the same for interfaces? I can't tell, because that also sits with whoever owns the interface, the brand, and it's their decision to make. But most likely, what you can do, and will be able to do, is still ask your ChatGPT, Gemini, or Grok to give you information in whatever way you want. You can just ask it: hey, give it to me as a classical website that I'm used to, I know all the conventions, and so on. If you have a nostalgic feeling for browsing websites, for instance, instead of just getting straight to the answer you're looking for.

Very cool. Constantine, that's already one hour, and during that hour I was even transported a couple of kilometers in a taxi. Hopefully that wasn't too much to ask of you. For everybody who tuned in, it's time to wrap up. Thank you, everybody. Thank you, Constantine. This was awesome. I think we're going to have to bring you back to talk about this again. Maybe that time we can do it in cars, like carpool karaoke, but in self-driving ones.

Yes, let's do that. So thanks to you, Constantine, and thanks to everybody who tuned in and listened with us, and for all your smart questions.