April 15 - Now Streaming (0.1.18)

Your new favorite streaming service is... Mito? Mito AI now streams responses live to the frontend: no more waiting, no more wondering what it's thinking.

🌀 Streaming Updates

Responses now stream live to the frontend for a faster, more interactive experience.

  • Streaming is supported for Chat, Smart Debug, and Code Explain.

  • Agent mode streaming will be added in a future update.

  • Available to all users, whether you're using your own API key or the Mito server.
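
If you're curious what streaming looks like in practice, here is a minimal sketch of consuming a streamed completion with the OpenAI Python client. This is only an illustration of the general technique, not Mito's internal code; the model name and the OPENAI_API_KEY environment variable are assumptions for the example.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Request a streamed completion so tokens arrive as they are generated,
# rather than waiting for the full response.
stream = client.chat.completions.create(
    model="gpt-4.1",  # assumed model name for this sketch
    messages=[{"role": "user", "content": "Explain this pandas error"}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries a small delta of text; render it immediately.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```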

📤 Output Added to Context

Mito AI now includes cell output as part of its context, allowing for more accurate and informed responses.

  • Output reading is enabled for both Chat and Agent messages.

  • A new UI indicator shows when output is being read.
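
As a rough illustration of the idea (not Mito's actual implementation), "reading output" means the cell's result is packed into the prompt alongside your question, so the model can reason about what actually ran. The helper function and message format below are hypothetical:

```python
def build_context(cell_source: str, cell_output: str, question: str) -> list[dict]:
    """Hypothetical sketch: bundle a cell's code and output into chat messages."""
    context = (
        "The user ran this cell:\n"
        f"```python\n{cell_source}\n```\n"
        f"It produced this output:\n{cell_output}\n"
    )
    return [
        {"role": "system", "content": context},
        {"role": "user", "content": question},
    ]

# Example: the model sees both the failing code and the traceback text.
messages = build_context(
    cell_source="df.describe()",
    cell_output="ValueError: Cannot describe a DataFrame without columns",
    question="Why did this fail?",
)
```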

🧠 GPT-4.1 by Default

Most Mito AI tools now use GPT-4.1 as the default model, providing improved performance and reasoning.

🐛 Bug Fixes & Improvements

  • Removed unnecessary metadata from chat history to reduce context window issues.

  • Fixed citation issues.

  • Agent mode no longer shows a cell preview when sending messages.

  • Chat naming no longer counts toward usage quota.

  • Agent now waits for cell execution before sending the next message.

  • Mito AI now defaults to Agent mode when appropriate.
