
Conversation


@roomote roomote bot commented Nov 8, 2025

This PR attempts to address Issue #9119. Feedback and guidance are welcome.

Changes

  • Added request and response fields to the ClineApiReqInfo interface (see the sketch after this list)
  • Store request data when initiating API calls in Task.ts
  • Accumulate and store response data during streaming in Task.ts
  • Updated ChatRow component to display request/response in expandable UI
  • Added click handler to API request items to toggle visibility
  • Display request as formatted JSON and response as markdown
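
As a reference point, here is a minimal sketch of what the type change and its population might look like. Only the request and response fields come from this PR's description; the surrounding field names, the recordApiExchange helper, and the stream-chunk shape are assumptions for illustration, not Roo's actual code.

```ts
// ExtensionMessage.ts (sketch) - existing fields abbreviated
export interface ClineApiReqInfo {
	tokensIn?: number
	tokensOut?: number
	cost?: number
	// New: the serialized request sent to the LLM, shown as formatted JSON in the UI
	request?: string
	// New: the response text accumulated during streaming, shown as markdown in the UI
	response?: string
}

// Task.ts (sketch): store the request up front, accumulate the response while streaming.
// ApiStreamChunk is an assumed shape, not Roo's real streaming type.
type ApiStreamChunk = { type: "text"; text: string } | { type: "usage" }

async function recordApiExchange(
	systemPrompt: string,
	messages: unknown[],
	stream: AsyncIterable<ApiStreamChunk>,
): Promise<ClineApiReqInfo> {
	const reqInfo: ClineApiReqInfo = {
		request: JSON.stringify({ systemPrompt, messages }, null, 2),
		response: "",
	}
	for await (const chunk of stream) {
		if (chunk.type === "text") {
			reqInfo.response += chunk.text
		}
	}
	return reqInfo
}
```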

How it works

When a user clicks on an API request item in the chat, they can now see:

  • The full request sent to the LLM (formatted as JSON)
  • The complete response received from the LLM (displayed as markdown)

This provides transparency into what is being exchanged between Roo and the LLM, which is especially helpful for understanding model behaviors and debugging.
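
A rough sketch of how that expandable rendering could look in ChatRow. The component below is illustrative only: the component name and markup are assumptions, and react-markdown stands in for whatever markdown renderer the webview actually uses.

```tsx
import { useState } from "react"
import ReactMarkdown from "react-markdown"

// Illustrative stand-in for the API request row in ChatRow.tsx
function ApiRequestRow({ info }: { info: { request?: string; response?: string } }) {
	const [expanded, setExpanded] = useState(false)

	return (
		<div>
			{/* Clicking the row toggles visibility of the request/response details */}
			<div onClick={() => setExpanded(!expanded)}>API Request</div>
			{expanded && (
				<div>
					<p>Request to LLM:</p>
					{/* The request is stored as a formatted JSON string, so it can be shown verbatim */}
					<pre>{info.request}</pre>
					<p>Response from LLM:</p>
					{/* The accumulated response is rendered as markdown */}
					<ReactMarkdown>{info.response ?? ""}</ReactMarkdown>
				</div>
			)}
		</div>
	)
}
```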

Fixes #9119


Important

Enhance API request items to display LLM request and response data in an expandable UI for better transparency and debugging.

  • Behavior:
    • Task.ts: Stores LLM request data when initiating API calls and accumulates response data during streaming.
    • ChatRow.tsx: Displays request/response data in an expandable UI when an API request item is clicked.
  • UI:
    • ChatRow.tsx: Adds click handler to toggle visibility of request/response data.
    • Displays request as formatted JSON and response as markdown.
  • Models:
    • ClineApiReqInfo in ExtensionMessage.ts: Adds response field to store response data.

This description was created by Ellipsis for 8ba9a6e.

@roomote roomote bot requested review from cte, jr and mrubens as code owners November 8, 2025 08:14
@dosubot dosubot bot added the size:M and enhancement labels Nov 8, 2025

roomote bot commented Nov 8, 2025

Rooviewer · Follow along on Roo Cloud

Reviewed the latest commit (8ba9a6e). The raw/formatted response toggle implementation looks good - no issues found.


Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.

@hannesrudolph hannesrudolph added the Issue/PR - Triage label Nov 8, 2025

buz-ch commented Nov 8, 2025

@roomote: Can you let the user toggle to see the raw response from the server, please? Markdown is not useful for debugging interaction issues.


roomote bot commented Nov 8, 2025

Fixaroo · See task on Roo Cloud

Added a toggle button to switch between raw and formatted views of the LLM response. All local checks passed.

View commit | Revert commit

The inline review comment below was left on this excerpt from the new toggle code:

```tsx
onClick={() => {
	setShowRawResponse(!showRawResponse)
}}
title={showRawResponse ? "Show formatted" : "Show raw"}>
{showRawResponse ? "Formatted" : "Raw"}
```

User-facing strings in the API response toggle (e.g. 'Response from LLM:', 'Raw', 'Formatted') are hardcoded. Please use the i18n translation function (e.g. t(...)) to support localization.

Suggested change:

```diff
- {showRawResponse ? "Formatted" : "Raw"}
+ {showRawResponse ? t("chat:apiRequest.formatted") : t("chat:apiRequest.raw")}
```

This comment was generated because it violated a code review rule: irule_C0ez7Rji6ANcGkkX.
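
Putting the excerpt and the suggestion together, the toggle could end up looking roughly like this. This is a sketch only: the component boundary, the useTranslation hook usage, and the surrounding markup are assumptions, while the chat:apiRequest.formatted and chat:apiRequest.raw keys come from the suggestion above.

```tsx
import { useState } from "react"
import { useTranslation } from "react-i18next"
import ReactMarkdown from "react-markdown"

// Sketch of the raw/formatted response toggle with the i18n suggestion applied
function ResponseView({ response }: { response: string }) {
	const { t } = useTranslation()
	const [showRawResponse, setShowRawResponse] = useState(false)

	return (
		<div>
			{/* The title strings would also need t(...) per the review rule above */}
			<button
				onClick={() => setShowRawResponse(!showRawResponse)}
				title={showRawResponse ? "Show formatted" : "Show raw"}>
				{showRawResponse ? t("chat:apiRequest.formatted") : t("chat:apiRequest.raw")}
			</button>
			{/* Raw view shows the stored response verbatim; formatted view renders it as markdown */}
			{showRawResponse ? <pre>{response}</pre> : <ReactMarkdown>{response}</ReactMarkdown>}
		</div>
	)
}
```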


buz-ch commented Nov 8, 2025

@roomote: Raw should include the entire answer from the LLM, including headers etc.


roomote bot commented Nov 8, 2025

Fixaroo · Follow along on Roo Cloud

Working on enhancing the raw response view to include the entire LLM response with headers, metadata, and thinking traces.
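
No commit for this appears in the thread yet. Purely as an illustration of one possible direction (Roo's provider calls go through SDKs, so the real change would look different), a raw transcript including the status line and headers could be captured from a fetch-based call before the body is streamed:

```ts
// Illustrative only: build a raw transcript (status, headers, body) of an HTTP LLM call
async function fetchWithRawTranscript(url: string, init: RequestInit): Promise<string> {
	const res = await fetch(url, init)

	// Record the status line and every response header before consuming the body
	let raw = `HTTP ${res.status} ${res.statusText}\n`
	res.headers.forEach((value, key) => {
		raw += `${key}: ${value}\n`
	})
	raw += "\n"

	// Append the streamed body verbatim (SSE events, including any thinking/metadata chunks)
	const reader = res.body!.getReader()
	const decoder = new TextDecoder()
	while (true) {
		const { done, value } = await reader.read()
		if (done) break
		raw += decoder.decode(value, { stream: true })
	}
	return raw
}
```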

@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Nov 8, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Nov 8, 2025

Labels

  • enhancement (New feature or request)
  • Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.)
  • size:M (This PR changes 30-99 lines, ignoring generated files.)

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

[ENHANCEMENT] Allow user to see request to LLM and response

4 participants