Persistent Thought Bubble Expansion Controls

Current Behavior

Currently in Cursor, AI thought bubbles (the “Thought for Xs” sections) automatically:

  • Expand during AI response generation
  • Collapse immediately when the AI finishes responding
  • Require a manual click to re-expand and read the full thinking process

Problem Statement

This auto-collapse behavior creates several usability issues:

Lost context: To understand how the AI approached a problem or why it made certain decisions, users must either catch the reasoning while it streams or remember to manually re-expand the bubble after the response completes.

Reduced learning value: The AI’s reasoning process is valuable for learning coding patterns, debugging approaches, and the steps behind complex problem solving, but auto-collapse makes it harder to access.

Interrupted reading: Users following the AI’s thinking in real time lose their place when the bubble suddenly collapses.

Inconsistent UX: Users can’t predict or control when the full reasoning will be readable.

Proposed Solutions

1. Settings-Based Control

Add options in Cursor Settings > General:

   "cursor.chat.expandThoughts": true,
   "cursor.chat.autoCollapseThoughts": false

2. Per-Conversation Controls

Add UI controls directly on thought bubbles (a sketch of the per-conversation state they imply follows the list):

  • Pin icon: Click to “pin” a thought bubble open permanently
  • Auto-collapse toggle: Per-conversation setting to override global preferences
  • Expand all/Collapse all: Buttons to manage multiple thought bubbles at once
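
A sketch of the per-conversation state these controls imply, assuming thought-bubble preferences are persisted alongside the chat history; every field name here is hypothetical:

    {
      "conversationId": "example-conversation",
      "thoughtBubbles": {
        "pinnedMessageIds": ["msg-07", "msg-12"],
        "autoCollapseOverride": false
      }
    }

Here pinnedMessageIds would record the bubbles held open via the pin icon, and autoCollapseOverride, when present, would take precedence over the global cursor.chat.autoCollapseThoughts setting, so a single chat can opt out without changing the default.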

3. Keyboard Shortcuts

  • Cmd/Ctrl + Shift + T: Toggle thought bubble expansion for the current message
  • Cmd/Ctrl + Shift + A: Toggle all thought bubbles in the current conversation (a sample keybindings sketch follows)
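
A sketch of how these could be exposed for remapping in keybindings.json, shown with the Windows/Linux chords; the command IDs and the when-clause context are hypothetical placeholders that Cursor would need to define:

    [
      {
        // Hypothetical command ID: toggle the bubble on the focused message
        "key": "ctrl+shift+t",
        "command": "cursor.chat.toggleThoughtBubble",
        "when": "chatFocused"
      },
      {
        // Hypothetical command ID: toggle every bubble in the conversation
        "key": "ctrl+shift+a",
        "command": "cursor.chat.toggleAllThoughtBubbles",
        "when": "chatFocused"
      }
    ]

The when clause matters here: Ctrl/Cmd + Shift + T already reopens the last closed editor in VS Code-based editors, so the binding should apply only while the chat pane has focus.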

Use Cases

Developers studying AI reasoning patterns for complex algorithms or architectural decisions need persistent access to thinking processes.

When the AI’s output isn’t quite right, understanding its reasoning helps users craft better follow-up prompts.

Teachers/students using Cursor for learning want to study the problem-solving approach, not just the final answer.

Users creating tutorials or sharing AI interactions want to include the reasoning process in screenshots or explanations.

Users on mobile or small screens need predictable control over which content takes up screen real estate.