Le Chat is the app developed by Mistral AI, the pioneering French artificial intelligence company. Like ChatGPT or DeepSeek, Le Chat lets everyone, everywhere, chat with Mistral's models. Released a few months ago, the app is powered by React Native and Expo.
The talk will be a technical dive into one central building block of the app: text streaming. HTTP streaming is the classic method for making an LLM's answer appear word by word on screen - yet React Native's fetch still requires a polyfill to support it. Why? How exactly does it work? What are the specific native considerations around it? And what else does it power within the React Native ecosystem?
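To make the idea concrete, here is a minimal sketch of what HTTP streaming looks like from the consumer's side. This is illustrative, not Le Chat's actual code: `makeStream` is a hypothetical stand-in for a server sending a chunked body (the role `response.body` plays with a streaming-capable fetch), and `readAll` consumes it chunk by chunk the way a chat UI would append text as it arrives.

```typescript
// Illustrative sketch, not Le Chat's implementation.
// makeStream simulates a server's chunked response body.
function makeStream(chunks: string[]): ReadableStream {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const c of chunks) controller.enqueue(encoder.encode(c));
      controller.close();
    },
  });
}

// readAll consumes the stream the way a streaming fetch's
// response.body would be read: one chunk at a time.
async function readAll(stream: ReadableStream): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // In an app, each decoded chunk would be appended
    // to the on-screen answer as it arrives.
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

readAll(makeStream(["Hello", " from", " a", " streamed", " response"])).then(
  (text) => console.log(text) // prints "Hello from a streamed response"
);
```

On the web, `fetch` exposes exactly this `ReadableStream` interface on `response.body`; React Native's built-in fetch does not, which is why a polyfill is needed to get the same incremental reads.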