v0.13 Release Notes, Docs Updates, Onboarding Updates #2642
Conversation
Caution: Review failed. The pull request is closed.

Walkthrough: Adds release notes and many docs/UX updates for v0.13.0, including Local AI / BYOK, new provider support, a unified configuration widget, terminal improvements, and styling tweaks. Extends AIMetrics with new provider and locality fields.

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Pre-merge checks and finishing touches: ❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
📜 Recent review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
Actionable comments posted: 0
🧹 Nitpick comments (1)
pkg/aiusechat/usechat.go (1)
63-69: Consider using URL parsing for more robust localhost detection. The current substring matching approach could produce false positives (e.g., "mylocalhost.com" would match) and misses IPv6 localhost ("::1"). For telemetry accuracy, consider parsing the endpoint URL and checking the hostname component directly.
Example refactor using URL parsing:
```diff
 func isLocalEndpoint(endpoint string) bool {
 	if endpoint == "" {
 		return false
 	}
-	endpointLower := strings.ToLower(endpoint)
-	return strings.Contains(endpointLower, "localhost") || strings.Contains(endpointLower, "127.0.0.1")
+	parsedURL, err := url.Parse(endpoint)
+	if err != nil {
+		return false
+	}
+	host := strings.ToLower(parsedURL.Hostname())
+	return host == "localhost" || host == "127.0.0.1" || host == "::1" || strings.HasPrefix(host, "127.")
 }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- pkg/aiusechat/uctypes/uctypes.go (1 hunks)
- pkg/aiusechat/usechat.go (4 hunks)
- pkg/telemetry/telemetrydata/telemetrydata.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
pkg/aiusechat/usechat.go (1)
pkg/aiusechat/uctypes/uctypes.go (1)
AIProvider_Custom (35-35)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Build for TestDriver.ai
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Analyze (go)
🔇 Additional comments (4)
pkg/telemetry/telemetrydata/telemetrydata.go (1)
151-152: LGTM! The new telemetry fields for tracking AI provider and locality are properly structured with appropriate JSON tags and naming conventions consistent with existing WaveAI telemetry properties.
pkg/aiusechat/uctypes/uctypes.go (1)
335-336: LGTM! The new AIMetrics fields are properly defined with consistent JSON tags and naming. These fields will enable tracking of AI provider types and whether endpoints are local, which aligns with the PR's objective to capture AI provider metadata.
pkg/aiusechat/usechat.go (2)
364-381: LGTM! The provider computation correctly defaults to `AIProvider_Custom` when unspecified, and the metrics are properly populated with both the provider and locality information. The flow from configuration to metrics is clean and straightforward.
581-582: LGTM! The telemetry fields correctly propagate the AI provider and locality information from the metrics, completing the data flow for tracking BYOK/Local AI usage. The naming is consistent with existing WaveAI telemetry conventions.
Actionable comments posted: 0
🧹 Nitpick comments (1)
frontend/app/aipanel/byokannouncement.tsx (1)
4-47: BYOKAnnouncement cleanly encapsulates config + docs entry point. Using `WaveAIModel.getInstance()` and delegating to `model.openWaveAIConfig()` gives a focused, reusable entry point for BYOK/local AI configuration, and the inline docs link is a nice touch. If you later see config-opening failures in practice, you could wrap `handleOpenConfig` in a try/catch and surface errors via `model.setError`, but it's fine as-is for now.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (19)
- ROADMAP.md (2 hunks)
- docs/docs/secrets.mdx (1 hunks)
- docs/docs/waveai-modes.mdx (5 hunks)
- docs/docs/waveai.mdx (1 hunks)
- docs/src/components/versionbadge.css (1 hunks)
- docs/src/components/versionbadge.tsx (1 hunks)
- frontend/app/aipanel/aipanel.tsx (3 hunks)
- frontend/app/aipanel/byokannouncement.tsx (1 hunks)
- frontend/app/app.scss (1 hunks)
- frontend/app/element/toggle.tsx (2 hunks)
- frontend/app/onboarding/onboarding-common.tsx (1 hunks)
- frontend/app/onboarding/onboarding-features.tsx (1 hunks)
- frontend/app/onboarding/onboarding-upgrade-minor.tsx (5 hunks)
- frontend/app/onboarding/onboarding-upgrade-patch.tsx (2 hunks)
- frontend/app/onboarding/onboarding-upgrade-v0130.tsx (1 hunks)
- frontend/app/onboarding/onboarding-upgrade.tsx (2 hunks)
- frontend/app/onboarding/onboarding.tsx (3 hunks)
- pkg/aiusechat/uctypes/uctypes.go (1 hunks)
- pkg/aiusechat/usechat-mode.go (4 hunks)
✅ Files skipped from review due to trivial changes (1)
- frontend/app/onboarding/onboarding-common.tsx
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-10-21T05:09:26.916Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2465
File: frontend/app/onboarding/onboarding-upgrade.tsx:13-21
Timestamp: 2025-10-21T05:09:26.916Z
Learning: The onboarding upgrade modal in frontend/app/onboarding/onboarding-upgrade.tsx intentionally freezes the lastVersion at mount using a ref to prevent the modal from changing or disappearing mid-interaction when the user is going through the onboarding flow.
Applied to files:
- frontend/app/onboarding/onboarding-upgrade-patch.tsx
- frontend/app/onboarding/onboarding-upgrade.tsx
- frontend/app/onboarding/onboarding-upgrade-v0130.tsx
- frontend/app/onboarding/onboarding-upgrade-minor.tsx
🧬 Code graph analysis (1)
frontend/app/element/toggle.tsx (1)
frontend/util/util.ts (1)
cn (502-502)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Analyze (go)
- GitHub Check: Build for TestDriver.ai
🔇 Additional comments (27)
frontend/app/onboarding/onboarding-upgrade-v0130.tsx (1)
4-93: v0.13 content component wiring and copy look solid. The structure and copy for the v0.13 patch content (Local AI & BYOK, Configuration Widget, and Terminal Updates) are clear, aligned with the release theme, and self‑contained. No logic or type issues stand out here.
frontend/app/onboarding/onboarding-upgrade.tsx (1)
10-38: Version routing to Minor vs Patch flows remains correct and stable. The switch to `UpgradeOnboardingMinor` for pre‑v0.12 installs while keeping `UpgradeOnboardingPatch` for >= v0.12.0 looks correct, and the `initialVersionRef` pattern still cleanly freezes `lastVersion` at mount to avoid mid‑interaction changes. Based on learnings, this preserves the intended upgrade‑modal behavior.
frontend/app/onboarding/onboarding-upgrade-patch.tsx (1)
20-51: Patch carousel wiring for v0.13.0 is consistent and correctly navigable. The new v0.13.0 entry and updated v0.12.5 `nextText` are wired correctly: the carousel opens on v0.13.0 by default, `prev`/`next` labels match the neighboring versions, and the content hook into `UpgradeOnboardingModal_v0_13_0_Content` is type‑safe and in line with existing entries.
frontend/app/onboarding/onboarding-upgrade-minor.tsx (1)
19-207: Minor upgrade flow rename and v0.13 welcome content look good. Renaming the component to `UpgradeOnboardingMinor` is cleanly applied (component, displayName, export), and the updated v0.13 welcome copy plus width tweaks integrate smoothly with the existing two‑page flow and final meta write to `CurrentOnboardingVersion`. No functional or typing issues stand out in the changed sections.
frontend/app/app.scss (1)
36-41: Plain-link hover underline is a good affordance tweak. Adding `text-decoration: underline` on hover for `.plain-link` improves link discoverability while keeping the base style subtle. No issues from a CSS/specificity standpoint.
5-35: Toggle now composes classes cleanly. Extending `ToggleProps` with `className` and routing it through `cn("check-toggle-wrapper", className)` is a clean way to allow external styling while preserving the base wrapper class.
24-25: BYOKAnnouncement placement in the welcome view looks appropriate. Importing and rendering `<BYOKAnnouncement />` inside `AIWelcomeMessage` cleanly surfaces BYOK/local configuration at first-run without affecting the main chat or telemetry gating logic. No behavioral regressions apparent.
Also applies to: 89-162
docs/docs/secrets.mdx (1)
11-11: VersionBadge noLeftMargin usage is consistent with the new API. Using `noLeftMargin={true}` here aligns with the new `.version-badge.no-left-margin` style and tightens the heading layout without affecting content.
129-135: Nice addition highlighting BYOK/local models in onboardingThe new key-icon row succinctly advertises BYOK and local model support in the Wave AI feature tour, matching the rest of the v0.13 messaging without impacting logic.
docs/src/components/versionbadge.css (1)
15-17: no-left-margin helper keeps VersionBadge backward-compatible. The `.version-badge.no-left-margin` rule cleanly opts out of the default left margin for specific call sites without affecting existing badges.
77-87: Link convention is already correct. The internal link uses the `.mdx` extension as established throughout the repository. Other docs files consistently use this pattern (e.g., `./secrets.mdx`, `./telemetry-old.mdx`, `./waveai.mdx`), so no change needed.
5-9: LGTM! Clean implementation of the optional `noLeftMargin` prop with appropriate conditional class application. The approach is idiomatic React.
333-334: LGTM! The new `AIProvider` and `IsLocal` fields extend `AIMetrics` appropriately for tracking provider-specific telemetry. The JSON tags follow the existing conventions in the struct.
55-55: LGTM! The simplified label works well in context since it's already within the telemetry section, making the full "Telemetry Enabled/Disabled" label redundant.
121-140: Clean simplification of the telemetry UI. Replacing the custom Toggle with a native checkbox reduces component dependencies while maintaining functionality. The label wrapping the input provides good click-target accessibility.
ROADMAP.md (3)
16-19: LGTM! Good documentation of the newly supported AI providers with version tags for traceability.
39-41: LGTM! BYOK and Local AI agent capabilities appropriately marked as completed for v0.13.
52-53: LGTM! File operation capabilities accurately reflect the v0.13 feature set.
docs/docs/waveai-modes.mdx (4)
7-9: LGTM! Good use of the VersionBadge component to indicate this documentation applies to v0.13.
50-94: Well-structured configuration guidance. The new sections for setting default AI mode and hiding cloud modes provide clear, actionable instructions with both CLI and UI approaches.
434-455: Comprehensive field reference. The table format provides excellent quick reference for all configuration options with clear required/optional indicators.
447-447: The documented default `2025-04-01-preview` is a valid and officially supported preview API version for Azure OpenAI. No changes are needed.
18-33: Well-organized endpoint and secret name constants. Good consolidation of provider-specific endpoints and secret names into named constants. This improves maintainability and reduces magic strings throughout the codebase.
75-84: Clean endpoint selection logic for OpenAI. The switch statement appropriately selects the endpoint based on API type, with a sensible default fallback to the chat completions endpoint.
100-104: LGTM! OpenRouter endpoint defaulting follows the same pattern as OpenAI.
107-120: Azure Legacy provider defaults look good. The endpoint template construction with resource name, deployment, and API version is clean. The validation check `isValidAzureResourceName` before constructing the endpoint prevents malformed URLs.
141-154: Google Gemini provider configuration. The endpoint template using the model name for Gemini's streaming endpoint is correct. Capabilities default appropriately for Google's vision-capable models.
Actionable comments posted: 0
🧹 Nitpick comments (1)
frontend/app/view/waveconfig/waveconfig-model.ts (1)
552-561: Consider deduping the telemetry call and distinguishing create vs update actions. The telemetry block here is identical to the one in `saveSecret`, and both use `"action:type": "waveconfig:savesecret"`. If you care about analytics detail, you might:
- Extract a small helper (e.g., `recordSecretSaveEvent(isNew: boolean)`) to avoid duplication, and/or
- Use different action types for add vs edit (e.g., `waveconfig:addsecret` vs `waveconfig:savesecret`).

Not required, but would improve maintainability and telemetry clarity.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
frontend/app/view/waveconfig/waveconfig-model.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
frontend/app/view/waveconfig/waveconfig-model.ts (2)
frontend/app/store/wshclientapi.ts (1)
RpcApi (682-682)
frontend/app/store/wshrpcutil.ts (1)
TabRpcClient (37-37)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Analyze (go)
- GitHub Check: Build for TestDriver.ai
🔇 Additional comments (1)
frontend/app/view/waveconfig/waveconfig-model.ts (1)
471-480: Secret-save telemetry placement and payload look correct. Telemetry is emitted only after a successful `SetSecretsCommand` and the payload does not include secret names or values, which is good from a correctness and privacy standpoint. Using `noresponse: true` also keeps this fire‑and‑forget and avoids blocking the user flow.
No description provided.