
Conversation

@matt423
Member

@matt423 matt423 commented Jan 16, 2026

Description

Similar to the message-per-token example, this PR demonstrates the message-per-response pattern using message appends, for both JavaScript and React.

Review App

Checklist

Summary by CodeRabbit

  • New Features

    • Added JavaScript and React example applications demonstrating AI Transport message-per-response streaming patterns on Ably channels, enabling users to stream tokens into single messages with live delivery and historical recovery capabilities.
    • Included comprehensive getting-started guides and documentation for both implementations.
  • Chores

    • Updated Ably dependency to version 2.17.0.


@matt423 matt423 self-assigned this Jan 16, 2026
@matt423 matt423 added the review-app Create a Heroku review app label Jan 16, 2026
@coderabbitai

coderabbitai bot commented Jan 16, 2026

Walkthrough

Adds two new example projects demonstrating AI Transport message-per-response streaming patterns: one using vanilla JavaScript and another using React. Both include mock LLM streaming, Ably channel integration, UI components, and configuration files. Updates shared dependencies and example registry.

Changes

  • JavaScript Example - Documentation & Configuration
    File(s): examples/ai-transport-message-per-response/javascript/README.md, index.html, package.json, styles.css, tailwind.config.ts, vite.config.ts
    Summary: New example project setup including HTML entry point, npm scripts, and Tailwind/Vite config with environment variable support (VITE_ABLY_KEY).

  • JavaScript Example - Business Logic
    File(s): examples/ai-transport-message-per-response/javascript/src/agent.ts, src/config.ts, src/llm.ts, src/script.ts
    Summary: Core streaming logic: the Agent class consumes LLM token streams and publishes to Ably channels via message.append; MockLLM simulates streaming responses; a browser-based UI script manages channel subscriptions (message.create, message.append, message.update) and displays live tokens.

  • React Example - Documentation & Configuration
    File(s): examples/ai-transport-message-per-response/react/README.md, index.html, package.json, postcss.config.js, tailwind.config.ts, tsconfig.json, tsconfig.node.json, vite.config.ts
    Summary: New React project setup with TypeScript configurations, PostCSS for Tailwind, and an environment directory override for the shared .env.local.

  • React Example - Business Logic
    File(s): examples/ai-transport-message-per-response/react/src/App.tsx, src/agent.ts, src/config.ts, src/index.tsx, src/llm.ts, src/styles/styles.css
    Summary: React component implementation with per-serial response caching, a connection state listener, message event handlers (create/append/update), and rewind support; the shared Agent and MockLLM logic is replicated from the JavaScript example.

  • Shared Updates
    File(s): examples/package.json, src/components/Examples/ExamplesRenderer.tsx, src/data/examples/index.ts
    Summary: Added a workspace for the JavaScript example, a new npm script, an Ably SDK version bump (2.16.0 → 2.17.0), and registered the new example in the examples registry.
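The core flow the walkthrough describes can be sketched against a minimal channel interface. ChannelLike mirrors only the two calls named in this review (publish returning serials, and appendMessage); treat the exact shapes as assumptions based on the ably-js 2.17.0 notes elsewhere in this review, not as the SDK's definitive signatures.

```typescript
// Sketch of the message-per-response flow: one publish per response,
// then one append per token. ChannelLike is a stand-in, not the real
// ably-js RealtimeChannel type.
interface ChannelLike {
  publish(msg: { name: string; data: string }): Promise<{ serials: string[] }>;
  appendMessage(msg: { serial: string; data: string }): Promise<void>;
}

async function streamResponse(
  channel: ChannelLike,
  tokens: Iterable<string>,
): Promise<string> {
  // The initial publish creates the message and yields its serial.
  const { serials } = await channel.publish({ name: 'response', data: '' });
  const serial = serials[0];
  for (const token of tokens) {
    // Each append is awaited so tokens land on the message in order.
    await channel.appendMessage({ serial, data: token });
  }
  return serial;
}
```

Subscribers then see one message.create followed by a message.append per token, all sharing the same serial.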

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested reviewers

  • GregHolmes
  • mschristensen

Poem

🐰 Tokens stream like cotton candy,
One message, many tokens appended,
Ably carries them swiftly,
React and JavaScript, both befriended,
AI Transport flows so grandly! 🎉

🚥 Pre-merge checks (5 of 5 passed)

  • Title check (✅ Passed): The title clearly and concisely describes the main change: adding message-per-response example code for AI Transport, which aligns with the PR's primary objective.
  • Linked Issues check (✅ Passed): The PR successfully implements both JavaScript and React examples demonstrating the message-per-response pattern with token appending, fully addressing the AIT-210 requirements.
  • Out of Scope Changes check (✅ Passed): All changes are scoped to the message-per-response example implementation and supporting configuration; the version bump of the Ably library is appropriate for the example updates.
  • Docstring Coverage (✅ Passed): No functions found in the changed files to evaluate docstring coverage; skipping the docstring coverage check.
  • Description Check (✅ Passed): Check skipped: CodeRabbit's high-level summary is enabled.


@ably-ci ably-ci had a problem deploying to ably-docs-ait-210-messa-cgauoa January 16, 2026 13:17 Failure
@matt423
Member Author

matt423 commented Jan 16, 2026

@coderabbitai review

@coderabbitai

coderabbitai bot commented Jan 16, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🤖 Fix all issues with AI agents
In `@examples/ai-transport-message-per-response/javascript/src/script.ts`:
- Around line 60-67: handlePromptClick creates a new Agent (which opens an
Ably.Realtime connection) but never calls agent.disconnect(), leaking
connections; update handlePromptClick to either reuse a single Agent instance
stored outside the function or ensure you await Agent.processPrompt(...) (if it
returns a Promise) and call agent.disconnect() in a finally block to always
clean up; reference the handlePromptClick function and the Agent
class/agent.disconnect()/Agent.processPrompt symbols when making the change.
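A minimal sketch of the second option (await, then disconnect in a finally block). AgentLike is a hypothetical stand-in for the real Agent class, which opens an Ably.Realtime connection:

```typescript
// Stand-in interface for the real Agent; only the two methods
// referenced in the fix prompt are modelled.
interface AgentLike {
  processPrompt(prompt: string): Promise<void>;
  disconnect(): void;
}

async function handlePromptClick(agent: AgentLike, prompt: string): Promise<void> {
  try {
    await agent.processPrompt(prompt);
  } finally {
    // Runs whether processPrompt resolves or throws, so the
    // underlying Realtime connection is never leaked.
    agent.disconnect();
  }
}
```

The alternative (a single long-lived Agent stored outside the handler) avoids the per-click connect/disconnect churn entirely.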

In `@examples/ai-transport-message-per-response/react/package.json`:
- Around line 1-10: The examples workspace is missing the React package entry so
its dependencies aren't hoisted; open the examples/package.json and add the
string "ai-transport-message-per-response/react" to the workspaces array
(alongside "ai-transport-message-per-response/javascript") so the React package
(name: "ai-transport-message-per-response-react") is registered and its deps
(react, vite, etc.) are resolved from the monorepo root.

In `@examples/ai-transport-message-per-response/react/src/App.tsx`:
- Around line 69-71: The Agent is created on each prompt click but never
disconnected, leaking Ably.Realtime connections; change the click handler to
store the Agent instance (e.g., in component state or a ref) instead of a
transient local variable and call its disconnect() when replacing or cleaning up
the previous instance, and also call agent.disconnect() in a useEffect cleanup
on component unmount to ensure all Ably clients created by Agent are closed;
refer to the Agent class/constructor and its disconnect() method when
implementing these changes.
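In plain TypeScript, the suggested lifecycle (keep at most one live instance, disconnect the previous one on replacement, dispose on unmount) might look like the holder below. AgentHolder is a hypothetical helper, not part of the example; in the component it would live in a useRef, with dispose() called from the useEffect cleanup.

```typescript
interface AgentLike { disconnect(): void; }

// Hypothetical holder: owns at most one live Agent at a time.
class AgentHolder<T extends AgentLike> {
  private current: T | null = null;

  replace(next: T): void {
    this.current?.disconnect(); // close the previous Agent's connection first
    this.current = next;
  }

  dispose(): void {
    // Intended for a useEffect cleanup, so unmount closes the last Agent.
    this.current?.disconnect();
    this.current = null;
  }
}
```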

In `@examples/ai-transport-message-per-response/react/src/config.ts`:
- Around line 1-3: The ABLY_KEY fallback in export const config currently uses
'demo-key-for-examples:YOUR_ABLY_KEY_HERE'; update the ABLY_KEY value to use the
same fallback as the JavaScript example ('YOUR_ABLY_KEY_HERE') so the React
variant matches the JS example, by editing the ABLY_KEY assignment in the config
object.

In `@examples/package.json`:
- Around line 8-12: The examples package.json is missing the React workspace and
run script for the new "ai-transport-message-per-response" example; add the
workspace "ai-transport-message-per-response/react" to the workspaces array
(next to "ai-transport-message-per-response/javascript") and add the
corresponding npm script entry mirroring the JS script (create a React script
name matching the existing JS script pattern for
"ai-transport-message-per-response") so dependencies install and the React
example can be run.
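The resulting examples/package.json fragment might look like this (surrounding fields elided; the existing JavaScript entry is taken from the comment above, while the React script name and command are assumptions following the same pattern):

```json
{
  "workspaces": [
    "ai-transport-message-per-response/javascript",
    "ai-transport-message-per-response/react"
  ],
  "scripts": {
    "ai-transport-message-per-response-react": "yarn workspace ai-transport-message-per-response-react dev"
  }
}
```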

In `@src/data/examples/index.ts`:
- Around line 23-24: Update the example's metaTitle and metaDescription fields
to reference "AI Transport" instead of "Ably Pub/Sub": locate the example entry
that defines metaTitle and metaDescription (the metaTitle and metaDescription
properties in this example object) and replace the Ably-specific copy with
wording that aligns with AI Transport branding and mentions streaming
AI-generated tokens via AI Transport so the tag ai_transport matches the meta
content.
🧹 Nitpick comments (8)
examples/ai-transport-message-per-response/react/src/index.tsx (1)

5-9: Consider adding null safety for the root element.

document.getElementById('root') can return null if the element doesn't exist, which would cause createRoot to throw. For example code this is acceptable, but you could add a non-null assertion for explicitness:

♻️ Optional: Add non-null assertion
-createRoot(document.getElementById('root')).render(
+createRoot(document.getElementById('root')!).render(
   <StrictMode>
     <App />
   </StrictMode>,
 );
examples/ai-transport-message-per-response/react/tsconfig.json (1)

11-12: Consider using "Bundler" module resolution for Vite projects.

For Vite-based projects with "module": "ESNext", the "Bundler" resolution mode is the recommended modern option as it better aligns with how bundlers resolve modules.

♻️ Optional: Use Bundler resolution
     "module": "ESNext",
-    "moduleResolution": "Node",
+    "moduleResolution": "Bundler",
examples/ai-transport-message-per-response/react/src/agent.ts (1)

35-40: Await appendMessage (or explicitly handle errors) to avoid dropped failures.
If channel.appendMessage() returns a Promise, not awaiting it can surface unhandled rejections and make token appends race each other. Consider awaiting each append to preserve ordering/backpressure, or add a brief comment if the fire‑and‑forget approach is intentional.

✅ Minimal change
         if (msgSerial && event.text) {
-          this.channel.appendMessage({
+          await this.channel.appendMessage({
             serial: msgSerial,
             data: event.text,
           });
         }

Based on learnings, keep error handling simple but document any intentional trade-offs.

examples/ai-transport-message-per-response/react/src/App.tsx (2)

34-46: Nested state setter pattern may cause stale closure issues.

The message.append handler uses setCurrentSerial to access the current serial, then conditionally calls setResponses inside. While this avoids stale closures, it's an unusual pattern that may confuse maintainers. The outer setter returns current unchanged, using it purely for read access.

Consider using a useRef for currentSerial if you only need to read it without triggering re-renders, or combine both pieces of state into a single reducer for cleaner logic.
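As a sketch of the reducer option: the state shape and action names below are assumptions modelled on the create/append events described above, not the example's actual code.

```typescript
type State = {
  currentSerial: string | null;
  responses: Map<string, string>;
};

type Action =
  | { type: 'create'; serial: string }
  | { type: 'append'; serial: string; text: string };

// Single reducer combining currentSerial and the per-serial response
// cache, removing the need to read state through a nested setter.
function responsesReducer(state: State, action: Action): State {
  switch (action.type) {
    case 'create':
      return {
        currentSerial: action.serial,
        responses: new Map(state.responses).set(action.serial, ''),
      };
    case 'append': {
      if (state.currentSerial !== action.serial) return state; // stale append
      const responses = new Map(state.responses);
      responses.set(action.serial, (responses.get(action.serial) ?? '') + action.text);
      return { ...state, responses };
    }
  }
}
```

With useReducer, both the message.create and message.append handlers dispatch plain actions, and no handler needs to read current state from inside a setter.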


8-13: Module-level initialization creates a second Ably connection.

The app creates an Ably client at module level (line 11-13), and then each Agent instance also creates its own client (per agent.ts). This means two separate Ably connections exist simultaneously.

For an example/demo this is acceptable, but consider adding a comment explaining this is intentional (agent simulates a backend service with its own connection) to avoid confusion.

examples/ai-transport-message-per-response/javascript/package.json (1)

1-10: Package configuration looks correct for a workspace example.

The package uses ESM modules and standard Vite scripts. Dependencies are inherited from the workspace root.

Consider adding "private": true to prevent accidental npm publishing, which is a common practice for example packages.

📦 Suggested addition
 {
   "name": "ai-transport-message-per-response-javascript",
   "version": "1.0.0",
+  "private": true,
   "type": "module",
   "scripts": {
examples/ai-transport-message-per-response/javascript/src/agent.ts (1)

36-41: Consider awaiting appendMessage to preserve token ordering.

The appendMessage call is not awaited, meaning tokens could theoretically arrive out of order if the network introduces variable latency. For a demo this is likely fine, but awaiting would guarantee correct sequencing.

♻️ Optional fix
       if (msgSerial && event.text) {
-          this.channel.appendMessage({
+          await this.channel.appendMessage({
           serial: msgSerial,
           data: event.text,
         });
examples/ai-transport-message-per-response/javascript/src/script.ts (1)

83-94: Consider awaiting async handlers or adding .catch() for unhandled rejections.

The async functions handleConnect and handleDisconnect are called without await in the toggle handler and at load time (line 94). This could lead to unhandled promise rejections if the operations fail.

For example code, this is a minor concern, but adding a simple .catch(console.error) would prevent uncaught errors in the browser console.

♻️ Optional improvement
-const handleConnectionToggle = () => {
+const handleConnectionToggle = async () => {
   if (channel.state === 'attached') {
-    handleDisconnect();
+    await handleDisconnect();
   } else {
-    handleConnect();
+    await handleConnect();
   }
 };

 connectionToggle.onclick = handleConnectionToggle;
 promptButton.onclick = handlePromptClick;

-handleConnect();
+handleConnect().catch(console.error);
📜 Review details

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Jira integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 6ec9fe2 and ec4f12c.

⛔ Files ignored due to path filters (1)
  • examples/yarn.lock is excluded by !**/yarn.lock, !**/*.lock
📒 Files selected for processing (27)
  • examples/ai-transport-message-per-response/javascript/README.md
  • examples/ai-transport-message-per-response/javascript/index.html
  • examples/ai-transport-message-per-response/javascript/package.json
  • examples/ai-transport-message-per-response/javascript/src/agent.ts
  • examples/ai-transport-message-per-response/javascript/src/config.ts
  • examples/ai-transport-message-per-response/javascript/src/llm.ts
  • examples/ai-transport-message-per-response/javascript/src/script.ts
  • examples/ai-transport-message-per-response/javascript/src/styles.css
  • examples/ai-transport-message-per-response/javascript/tailwind.config.ts
  • examples/ai-transport-message-per-response/javascript/vite.config.ts
  • examples/ai-transport-message-per-response/react/README.md
  • examples/ai-transport-message-per-response/react/index.html
  • examples/ai-transport-message-per-response/react/package.json
  • examples/ai-transport-message-per-response/react/postcss.config.js
  • examples/ai-transport-message-per-response/react/src/App.tsx
  • examples/ai-transport-message-per-response/react/src/agent.ts
  • examples/ai-transport-message-per-response/react/src/config.ts
  • examples/ai-transport-message-per-response/react/src/index.tsx
  • examples/ai-transport-message-per-response/react/src/llm.ts
  • examples/ai-transport-message-per-response/react/src/styles/styles.css
  • examples/ai-transport-message-per-response/react/tailwind.config.ts
  • examples/ai-transport-message-per-response/react/tsconfig.json
  • examples/ai-transport-message-per-response/react/tsconfig.node.json
  • examples/ai-transport-message-per-response/react/vite.config.ts
  • examples/package.json
  • src/components/Examples/ExamplesRenderer.tsx
  • src/data/examples/index.ts
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-12-23T11:43:36.746Z
Learnt from: matt423
Repo: ably/docs PR: 3054
File: examples/ai-transport-token-streaming/react/src/BackendLLMService.ts:47-79
Timestamp: 2025-12-23T11:43:36.746Z
Learning: In all example code under the examples directory, prioritize brevity and clarity over production-grade error handling. Use simple, easy-to-understand error handling in examples to keep the focus on demonstrating usage, and clearly document any trade-offs or simplifications in comments so readers understand the differences from production code.

Applied to files:

  • examples/ai-transport-message-per-response/javascript/src/config.ts
  • examples/ai-transport-message-per-response/javascript/src/script.ts
  • examples/ai-transport-message-per-response/javascript/src/agent.ts
  • examples/ai-transport-message-per-response/javascript/src/llm.ts
  • examples/ai-transport-message-per-response/react/src/agent.ts
  • examples/ai-transport-message-per-response/react/tailwind.config.ts
  • examples/ai-transport-message-per-response/react/src/llm.ts
  • examples/ai-transport-message-per-response/javascript/vite.config.ts
  • examples/ai-transport-message-per-response/javascript/tailwind.config.ts
  • examples/ai-transport-message-per-response/react/vite.config.ts
  • examples/ai-transport-message-per-response/react/src/config.ts
🧬 Code graph analysis (3)
examples/ai-transport-message-per-response/javascript/src/script.ts (2)
examples/ai-transport-message-per-response/javascript/src/config.ts (1)
  • config (1-3)
examples/ai-transport-message-per-response/javascript/src/agent.ts (1)
  • Agent (8-52)
examples/ai-transport-message-per-response/javascript/src/agent.ts (1)
examples/ai-transport-message-per-response/javascript/src/llm.ts (1)
  • MockLLM (10-49)
examples/ai-transport-message-per-response/react/src/App.tsx (2)
examples/ai-transport-message-per-response/react/src/config.ts (1)
  • config (1-3)
examples/ai-transport-message-per-response/react/src/agent.ts (1)
  • Agent (8-52)
🪛 LanguageTool
examples/ai-transport-message-per-response/javascript/README.md

[style] ~43-~43: This adverb was used twice in the sentence. Consider removing one of them or replacing them with a synonym.
Context: ...the value of VITE_ABLY_KEY to be your Ably API key. 5. Install dependencies: `...

(ADVERB_REPETITION_PREMIUM)


[style] ~61-~61: This adverb was used twice in the sentence. Consider removing one of them or replacing them with a synonym.
Context: ...ur VITE_ABLY_KEY variable to use your Ably API key.

(ADVERB_REPETITION_PREMIUM)

examples/ai-transport-message-per-response/react/README.md

[style] ~43-~43: This adverb was used twice in the sentence. Consider removing one of them or replacing them with a synonym.
Context: ...the value of VITE_ABLY_KEY to be your Ably API key. 5. Install dependencies: `...

(ADVERB_REPETITION_PREMIUM)


[style] ~61-~61: This adverb was used twice in the sentence. Consider removing one of them or replacing them with a synonym.
Context: ...ur VITE_ABLY_KEY variable to use your Ably API key. ## How it works The message-...

(ADVERB_REPETITION_PREMIUM)

🔇 Additional comments (23)
examples/ai-transport-message-per-response/react/src/styles/styles.css (1)

1-3: Standard Tailwind CSS setup.

The directives correctly import Tailwind's base styles, component classes, and utility classes in the proper order.

examples/ai-transport-message-per-response/react/tsconfig.node.json (1)

1-10: Standard Vite TypeScript configuration.

The configuration correctly targets the Vite config file with appropriate compiler options for the bundler environment.

examples/ai-transport-message-per-response/react/index.html (1)

1-12: Standard Vite + React entry point.

The HTML scaffold correctly sets up the document with proper charset, viewport meta tags, a root element for React, and a module script entry point.

src/components/Examples/ExamplesRenderer.tsx (1)

40-40: Dependency version bump is appropriate.

Ably SDK 2.17.0 exists and introduces useful new features including realtime update/delete operations and message append support, which align well with the AI Transport examples being added. The tilde version range (~2.17.0) safely permits patch updates while preventing unintended minor version upgrades. No breaking changes reported for this release.

examples/ai-transport-message-per-response/react/tailwind.config.ts (1)

1-9: LGTM!

Clean configuration that properly extends the shared base config and specifies appropriate content paths for the React TypeScript project.

examples/ai-transport-message-per-response/react/postcss.config.js (1)

1-6: LGTM!

Standard PostCSS configuration for Tailwind CSS with the correct plugin setup.

examples/ai-transport-message-per-response/react/vite.config.ts (1)

1-6: Looks good—envDir override aligns with base config.

examples/ai-transport-message-per-response/javascript/README.md (1)

1-61: Clear and well-structured walkthrough.

examples/ai-transport-message-per-response/react/src/llm.ts (1)

1-48: Mock LLM stream implementation is clear and easy to follow.

examples/ai-transport-message-per-response/react/src/App.tsx (1)

15-132: Overall implementation is correct for demonstrating the message-per-response pattern.

The component properly handles:

  • message.create for initial message creation
  • message.append for token streaming
  • message.update for history rewind/recovery
  • Connection state management and channel detach/reattach with rewind option

The UI correctly disables the prompt button when disconnected or channel is detached.

examples/ai-transport-message-per-response/react/README.md (1)

1-70: Documentation is comprehensive and well-structured.

The README clearly explains:

  • The message-per-response streaming pattern and its benefits
  • All relevant Ably components and their purposes
  • Step-by-step getting started instructions
  • How the pattern works at a high level

The static analysis hints about "adverb repetition" are false positives and can be safely ignored.

examples/ai-transport-message-per-response/javascript/src/styles.css (1)

1-3: Standard Tailwind CSS setup.

This is the correct minimal configuration for Tailwind CSS.

examples/ai-transport-message-per-response/javascript/index.html (1)

1-49: Clean HTML structure for the JavaScript example.

The markup is well-organized with:

  • Proper semantic structure and viewport meta tag
  • Tailwind utility classes for responsive styling
  • Clear UI sections for status, response display, and prompt selection
  • Correct ES module script import
examples/ai-transport-message-per-response/javascript/src/config.ts (1)

1-3: LGTM!

Clean and simple configuration that follows Vite conventions for environment variables with a clear placeholder fallback.

examples/ai-transport-message-per-response/javascript/tailwind.config.ts (1)

1-9: LGTM!

Configuration correctly extends the shared base config and specifies appropriate content paths for Tailwind CSS purging.

examples/ai-transport-message-per-response/javascript/vite.config.ts (1)

1-7: LGTM!

Vite configuration correctly extends the base config and sets envDir to the parent directory for shared environment variables.

examples/ai-transport-message-per-response/javascript/src/llm.ts (1)

1-49: LGTM!

Well-structured mock LLM service that clearly demonstrates the streaming pattern with message_start, message_delta, and message_stop events. The random chunking and delays effectively simulate realistic LLM token streaming behavior.

examples/ai-transport-message-per-response/javascript/src/agent.ts (2)

1-52: Good demonstration of the message-per-response pattern.

The implementation clearly shows the core concept: publishing an initial message to obtain a serial, then appending tokens to that same message. The code is well-commented and easy to follow. Based on learnings, the simplified error handling is appropriate for example code.


29-33: The API is confirmed to be available and stable in Ably JS SDK v2.17.0. The channel.publish() method returns a PublishResult object that includes a serials array, which is part of the message appends feature introduced in this release and is documented in the official Ably changelog and SDK documentation.

examples/ai-transport-message-per-response/javascript/src/script.ts (4)

1-11: LGTM!

Clean setup with proper imports and a unique per-session channel name using crypto.randomUUID(). The initialization is straightforward and appropriate for this example.


13-27: LGTM!

The response tracking using a Map keyed by message serial is a clean approach for managing multiple responses. The updateDisplay function correctly guards against null currentSerial.


29-58: Well-structured message handling.

The subscription logic correctly handles all three message actions (create, append, update) with appropriate guards. The check if (currentSerial === serial) for appends is a good safeguard against out-of-order token delivery. The comment about rewind behavior is helpful context for readers.


69-81: LGTM!

The connect/disconnect handlers are clean and focused. Setting the rewind option before attaching ensures history is delivered as message.update events. Based on learnings, simplified error handling is acceptable for example code.


@matt423 matt423 force-pushed the ait-210-message-per-response-example branch from ec4f12c to d3043a1 Compare January 16, 2026 13:37
@matt423 matt423 temporarily deployed to ably-docs-ait-210-messa-cgauoa January 16, 2026 13:38 Inactive
Prevent multiple agent (and Realtime).
@matt423 matt423 temporarily deployed to ably-docs-ait-210-messa-cgauoa January 16, 2026 13:52 Inactive
@matt423 matt423 marked this pull request as ready for review January 16, 2026 13:53
@matt423 matt423 requested a review from mschristensen January 16, 2026 14:02
@mschristensen mschristensen merged commit 20df5cb into AIT-129-AIT-Docs-release-branch Jan 16, 2026
7 checks passed
@mschristensen mschristensen deleted the ait-210-message-per-response-example branch January 16, 2026 14:37