{ "hq": [ { "speaker": 0, "text": "Come up. Yes. Alright. So, Chris, keep through the presentation so that I can get to the demos, which are not risky risky at all to give a demo in this time. Alright. So, I already have myself, Guillermo, CEO of Purcell. I think we all agree we're at the AGI house that software will change very, very dramatically in the coming years. So we're all in the right place as developers and end users of AI software. I like some of the conversations we just had around embracing the non determinism of LLMs. In And the way that I explained it is we're going from computers that reliably say that 2 plus 2 equals 4 to computers of the software 2.0 stack that hopefully say that 2 plus 2 is 4, but it's kind of promoted as it's 2 by 7.5. And our job as software 2.0 engineers is to deliver this new applications that actually make reliable use of AI and productive use of AI. And what's been happening in this transition from software 1.0 to software 2 point o, I I probably describe it as a shift from ML to AI. And what's exciting for folks that are in the Next. Js and Brazil ecosystem, we're all kind of already drinking from the Kool Aid of JavaScript and TypeScript, but I cannot mention it as going from back end first to front end first. We're going from a world where we just talked about only a few companies being able to train large language models to these models being broadly accessible to everybody in the world. And what's really exciting is that we're going from a TAM of 1,000,000 developers can create this next generation AI systems to everybody in the world can create AI systems. And I think a lot of the enterprise demand is coming from the fact that enterprises feel it's the right time for them to be able to participate even though they might be in traditional businesses where ML was sort of unapproachable. And one bold claim that I'll make, of course, I'm biased, is that, the AI engineer of the future is what of us? 
We write JavaScript and TypeScript code. Not biased at all. But there is something to be said about the generations of AI applications and end-user experiences that have come to market. Not too long ago, when you would go to ChatGPT and ask something about the current time or the state of the universe, you would get this awesome non-answer about, I'm a large language model, and I don't know anything. Then the introduction of tools, I think, was really remarkable. I still think they're underrated by the developer community. But with tools, now we can query oracles to get data from other systems, so we can actually deliver working systems. In fact, this is how we solve 2 plus 2 equals 4, because we can say, if it's a math problem, rephrase it as a code problem, or don't try to answer it at all, per the earlier point that some narrow enterprise use cases are not supposed to do that. And what's exciting for us is that we're entering this world of generated UI. We can now start bringing back some cool elements of software 1.0, which is making user interfaces that people love. So I'll go through an example here of AI Airways, and this is the kind of example that our technology will power. The flight is a 737 MAX, and the end user wants to say, well, I don't wanna fly a 737 MAX, so what else do you have? And instead of just giving you text, we give you UI. So I think this is gonna create better experiences. And then the chatbot might say, well, do you wanna change your seat? Once again, the answer there is not to bring in more text, but to actually stream UI. So I'll keep speeding through these, but you could imagine this generated UI concept being applied to a really wide range of domains. And specifically, I'm really excited about the application to mobile, because most companies have struggled with mobile. The average mobile website sucks.
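[Editor's note: the tools-as-oracles idea above, where arithmetic is routed to deterministic code instead of being guessed by the model, can be sketched in a few lines. This is an illustrative sketch, not the AI SDK's actual API; the tool registry and the handleToolCall helper are hypothetical names.]

```typescript
// Illustrative sketch: a deterministic 'calculator' tool the model can call,
// so that 2 + 2 is reliably 4 instead of a probabilistic guess.
type Tool = { description: string; run: (input: string) => string };

const tools: Record<string, Tool> = {
  calculator: {
    description: 'Evaluate a basic arithmetic expression',
    // A tiny evaluator for 'a op b' expressions; real code would parse properly.
    run: (input) => {
      const m = input.match(/^\s*(-?\d+(?:\.\d+)?)\s*([+\-*\/])\s*(-?\d+(?:\.\d+)?)\s*$/);
      if (!m) throw new Error(`unsupported expression: ${input}`);
      const [, a, op, b] = m;
      const x = Number(a), y = Number(b);
      const results: Record<string, number> = { '+': x + y, '-': x - y, '*': x * y, '/': x / y };
      return String(results[op]);
    },
  },
};

// The model would emit a structured call like { tool: 'calculator', input: '2 + 2' };
// we execute it deterministically and feed the result back.
function handleToolCall(call: { tool: string; input: string }): string {
  const t = tools[call.tool];
  if (!t) throw new Error(`unknown tool: ${call.tool}`);
  return t.run(call.input);
}
```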
As a representative of the open web, I think I can say that I've seen quite a few websites in my time. And native apps suck too. They're very expensive to write, and the app store takes its cut. So AI-generated UI, shipping better mobile-first user experiences, is something I'm very excited about. Instead of making the customer go deep into a UI, like four clicks down, and read lots of documentation pages, you just give them the UI that they want. And if you've been using React, you're in the right place at the right time, because React has been on this journey of moving to the server side. I'm a big believer that the big clouds will accrue a lot of the value in the AI evolution, because you don't just need lots of GPUs to do inference. You also need your data, and most enterprise data is already in the cloud or heading towards the cloud. And the more interesting aspect as well, and we've seen this with the first generation of React applications, is that you need compute in very close proximity to your data. You need to be able to do retrieval in sub-5 milliseconds because the end user is waiting. You need to be able to chain these pipelines and queues. And so data, LLM, and compute proximity is extremely critical, and React was already sort of headed back to the server side. So I'll get really quickly to the code, and it's gonna be something that's useful for you all during this hackathon. We created this AI SDK. You can think of it as the ORM for LLMs. In a past life, I actually wrote an ORM called Mongoose for the MongoDB database. And I think ORMs are an awesome utility to sort of say, hey, of course I wanna use Postgres or Mongo, but I actually need to write an app, like, how do I get started? So the AI SDK is that answer for LLMs.
And I mentioned function calling being so important, because the fundamental idea of this generated UI concept is that you tell the LLM what your data is and what functions it can call. And when the function calls come back, you can convert them into UI and then stream the UI to the client. So in the example of AI Airways, I send my prompt, I send my tools, it comes back, and I end up mapping it to a React component called FlightInfo. Same with the seats. This component is gonna be richly interactive, so when these components actually ship to the client side, they can be hydrated and become augmented with interaction and more data. And what's really cool is we give you a model for tracking the state back into the LLM. The LLM always knows what action the user took, even on the client side, which is very powerful. And it's actually really, really easy to use. This is real code. It's 31 lines of code to use Mixtral, telling it it's a flight assistant. This is how you render text when it's text only, because sometimes you do wanna use just text. This is how you define the tools and the schema such that you reliably get JSON. Something I like to say is, in going from software 1.0 to software 2.0, you still have to make the systems reliable, and function tools with schemas are a really good way of doing that. And then when the response comes back, this is where, if you're more committed to React, it gets really powerful, because you can do asynchronous components. In this case, you can call out to a flight information API and say, I'm gonna hydrate the component with that external data, so that the model doesn't have to hallucinate about a flight that it doesn't know exists. So what I'm excited about for the enterprise, and I'm gonna touch on it in the panel, is that it's paying the bills. Right?
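[Editor's note: the flow described above, a function call coming back as schema-checked JSON and being mapped to a named React component plus props, can be sketched like this. It's an illustrative sketch, not the AI SDK's real API; the uiTools registry and renderToolCall helper are hypothetical names for the AI Airways example.]

```typescript
// Illustrative sketch of the generated-UI flow: validate the model's
// function call against the tool's schema, then map it to a component
// descriptor the client can hydrate.
type ToolCall = { name: string; args: Record<string, unknown> };
type UIPayload = { component: string; props: Record<string, unknown> };

// Hypothetical tool registry; 'required' stands in for a real schema library.
const uiTools: Record<
  string,
  { required: string[]; render: (args: Record<string, unknown>) => UIPayload }
> = {
  getFlightInfo: {
    required: ['flightNumber'],
    render: (args) => ({ component: 'FlightInfo', props: args }),
  },
  pickSeat: {
    required: ['flightNumber', 'seat'],
    render: (args) => ({ component: 'SeatMap', props: args }),
  },
};

function renderToolCall(call: ToolCall): UIPayload {
  const tool = uiTools[call.name];
  if (!tool) throw new Error(`unknown tool: ${call.name}`);
  // Schema check: reliable JSON in, reliable UI out.
  for (const key of tool.required) {
    if (!(key in call.args)) throw new Error(`missing arg: ${key}`);
  }
  return tool.render(call.args);
}
```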
Like, companies like Klarna are shipping this, and what's really cool about their press release is they actually embedded a generated UI example of, like, someone saying, hey, help me make sense of this refund, and you notice that they're rendering it in a very rich form. They're not just saying, oh yeah, I think you spent $38, do you want the refund? So I think this idea of merging the best of both worlds of traditional GUI applications and chatbots has a lot of legs. So I'll walk you through very, very quickly what we call the Vercel AI application stack. Vercel is not hosting your LLMs. We call out to whatever LLMs you want, AWS Bedrock, OpenAI, Anthropic, etcetera. But we give you all the sort of starting points to actually build applications. So, praying to the demo gods, I'll show this really quickly. To learn more about the AI SDK, this ORM for LLMs, you can go to sdk.vercel.ai. Here, we let you actually contrast and compare all the LLMs. Not all, of course, there's a gazillion of them. So we let you play with all the LLMs that are kind of in vogue, and you may ask, well, what is the best model? I promise I type faster in real life. And here you can start analyzing that spectrum of performance and reasoning capability that I talked about earlier. So you can see that Mixtral, actually, I believe it's served by Groq, is freaking phenomenally fast. And there are models that have better reasoning power and take their time. So we give you the code, and you can integrate it into your application. This makes use of that AI SDK that I talked about. You import it into your application, and you're off to the races. We also pioneered this concept of generated UI for design tools.
So I was just browsing v0, and in the public realm you can see what people do with this tool. With v0.dev, you upload a screenshot and we convert it into working code. So someone uploaded a screenshot, we give them three possibilities of what that UI should look like, and we actually give you the working React code, production grade, so you can ship it into your application. So the best way to start a project during this hackathon is to go to v0 and let the AI write the first version. So I mentioned, okay, how do you write an AI application that merges the best of both worlds between text and UI? We open sourced an example, the AI chatbot. In this case, I'm gonna ask it, what are the trending stocks? Notice that we introduced an artificial delay there. Presumably it takes a while to get the data about the stocks. Something to know about enterprises is all of their APIs are really slow, so it's good to think about the loading states. And then it can say, like, buy Apple. Not financial advice, I have no idea, so don't hold me to it. And as I was mentioning, here's where you can start playing with interactivity on the client side, and the AI can track this interactivity. So if I say buy, it can track that I wanted to buy a certain amount. We went through a lot of care to actually think about how to merge the interactive client-side model and the server side. And as I mentioned, the good thing about this is, if you go to sdk.vercel.ai/demo, you can fork it. I forked it, and I figure if a CEO can do this, anybody can do this. So I forked it, and I added two more tools. I added play Pac-Man, and what it does is it's gonna stream a component called react-pacman, and I can play Pac-Man. I'm gonna try not to die, embarrassingly. Oh my god. Alright. And another one that actually is really cool to show off how you can sort of retain context with the AI: I threw confetti, another React component called react-confetti.
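[Editor's note: the idea above, that a click on the client-side buy button is tracked back into the LLM's context, can be sketched as appending the UI event to the message history the model sees. This is an illustrative sketch, not Vercel's implementation; the Message shape and recordUIAction helper are hypothetical.]

```typescript
// Illustrative sketch: represent a client-side UI action as a structured
// message appended to the chat history, so the model knows what the user did.
type Message = { role: 'user' | 'assistant' | 'system'; content: string };

function recordUIAction(
  history: Message[],
  action: { type: string; payload: Record<string, unknown> }
): Message[] {
  // Return a new history rather than mutating the old one.
  return [
    ...history,
    { role: 'system', content: `[ui action] ${action.type}: ${JSON.stringify(action.payload)}` },
  ];
}
```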
And notice here I'm just gonna use the reasoning capabilities of the model. I'm not even mentioning the word confetti. So you can see how you retain sort of that reasoning power across the interaction. Anyways, that's it. I'll show the links once again so you don't miss anything. Check out sdk.vercel.ai, and you're off to the races. Thank you so much. Alright. Thank you.", "start": 0.0, "end": 723.885 } ] }