Product thesis
Ship Text0 as a reference-aware writing editor first, not a broad productivity suite.
Goals
- Increase weekly active editors using the protected doc flow within 90 days of launch.
- Reduce time-to-first-draft by making reference-backed autocomplete and chat available from the editor.
- Prove retention on the core writing loop: open doc, add reference, generate text, revise, return.
- Stabilize the app for a public beta by reducing user-facing failures in auth, uploads, and AI calls.
Non-goals
- Do not build payments or plans in v1; billing evidence is missing.
- Do not add enterprise workspace management or multi-tenant admin controls; multi-tenancy evidence is missing.
- Do not expand into a general-purpose content platform, CMS, or collaboration suite.
- Do not add new integrations beyond the existing GitHub, Linear, and Notion surfaces.
- Do not promise offline mode, version history, or real-time multi-user editing in v1.
- Do not treat the app as production-ready until deploy, CI, tests, observability, and migrations exist.
Users and user stories
- Solo writer or founder drafting with source material open beside them
- Product manager or operator turning notes from Notion, Linear, and GitHub into polished writing
- Technical writer or developer who wants context-aware autocomplete inside a document
- Power user who wants voice transcription and text-to-speech while editing
Solo writer or founder
As a solo writer, I want to open a document and get context-aware autocomplete from my references, so that I can draft faster without switching tools.
- I can open an existing protected document.
- I can trigger autocomplete from the editor.
- Suggestions use user-provided references when available.
- I can accept or dismiss a suggestion without losing editor state.
Product manager or operator
As a product manager, I want to connect a source like Notion or Linear, so that I can write from current project context.
- I can visit the integrations page and see supported sources.
- I can connect at least one source integration from the UI.
- Connected sources are selectable as references in a document.
- Disconnected or missing integrations do not break the editor.
Technical writer or developer
As a technical writer, I want to chat with the content in my document, so that I can rewrite sections and ask for specific edits.
- I can open the AI chat sidebar from a document.
- I can ask a question about the current document.
- The response is scoped to the selected document and references.
- I can continue editing after the chat response without a full page reload.
Power user
As a power user, I want voice input and output in the editor, so that I can draft hands-free and review drafts by listening.
- I can start and stop voice transcription from the editor.
- Transcribed text appears in the document or active input target.
- I can trigger text-to-speech for editor content.
- Failures in speech features show a visible error state.
Solo writer or founder
As a writer, I want to search and reopen recent files from home, so that I can return to active work quickly.
- I can see recent files on the home page.
- I can search documents from the home surface.
- I can open a document from search or recent files.
- Empty or loading states are handled explicitly.
Feature scope and acceptance
Wire document editor
Must: Keep the protected document page focused on one primary editing surface with supporting panels for references and AI actions. The current app already has doc, tour, and editor components; v1 should tighten that flow instead of expanding scope.
- A document page loads from /docs/[doc_id].
- The editor renders with no hydration error on first load.
- The editor supports typing and saving changes through the existing document flow.
- Loading and empty states are visible when the document is unavailable.
Add reference selection
Must: Let the user attach references to a document and surface those references to autocomplete and chat. This is the core differentiator implied by the repo.
- The user can open a reference selector from the editor.
- The user can choose from available references.
- Selected references are persisted for the current document.
- The editor can read the active reference set for generation.
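The acceptance criteria above can be sketched as a small per-document reference store. This is a hypothetical illustration, not the repo's actual data model: an in-memory map stands in for the Upstash Redis layer, and names like `ReferenceStore` and the `doc:<id>:refs` key shape are assumptions.

```typescript
type ReferenceId = string;

// Illustrative store: attach/detach references per document and expose
// the active set for generation. In the app this would be backed by
// Upstash Redis rather than an in-process Map.
class ReferenceStore {
  private store = new Map<string, Set<ReferenceId>>();

  private refsKey(docId: string): string {
    return `doc:${docId}:refs`;
  }

  // Attach a reference to a document; repeated attaches are idempotent.
  attach(docId: string, refId: ReferenceId): void {
    const key = this.refsKey(docId);
    const set = this.store.get(key) ?? new Set<ReferenceId>();
    set.add(refId);
    this.store.set(key, set);
  }

  detach(docId: string, refId: ReferenceId): void {
    this.store.get(this.refsKey(docId))?.delete(refId);
  }

  // Read the active reference set, e.g. to build a generation prompt.
  activeRefs(docId: string): ReferenceId[] {
    return [...(this.store.get(this.refsKey(docId)) ?? [])];
  }
}
```

The key detail is that attach is idempotent and the read path returns a plain array, so the editor can pass the active set straight into a generation call.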
Trigger AI chat
Must: Expose the existing AI chat sidebar as a first-class action for asking questions and requesting edits inside a document.
- The user can open and close the AI chat sidebar.
- The user can submit a prompt from the sidebar.
- The assistant response is shown inside the document context.
- Request errors are surfaced to the user without losing the draft.
Support autocomplete modes
Should: Implement the writing modes hinted at in issue #36 so the assistant can behave differently for drafting, rewriting, and continuation. This keeps the product aligned with the repo's writing-assistant shape.
- At least three writing modes are selectable in the editor.
- The selected mode changes the generation behavior or prompt template.
- The active mode is visible to the user.
- Switching modes does not clear the current draft.
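One way to satisfy "switching modes does not clear the current draft" is to make the mode select only a prompt template, never mutate editor state. The mode names and template strings below are assumptions for illustration; issue #36 may specify different ones.

```typescript
// Hypothetical writing modes; the real set may differ.
type WritingMode = "draft" | "rewrite" | "continue";

// Each mode maps to a prompt template. The draft text is passed in
// read-only, so changing modes can never clear or alter it.
const MODE_TEMPLATES: Record<WritingMode, (text: string) => string> = {
  draft: (text) => `Draft a new passage based on these notes:\n${text}`,
  rewrite: (text) => `Rewrite the following passage, preserving its meaning:\n${text}`,
  continue: (text) => `Continue this passage in the same voice:\n${text}`,
};

function buildPrompt(mode: WritingMode, draft: string): string {
  return MODE_TEMPLATES[mode](draft);
}
```

Because the mode only selects a template, the acceptance criteria about visible active mode and preserved drafts reduce to pure UI state.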
Connect source integrations
Should: Make the existing GitHub, Linear, and Notion integration pages functional enough to provide reference material for writing. The goal is not full sync coverage; it is usable source onboarding.
- Each integration page explains what source data it provides.
- A connected source can be selected as a reference in the editor.
- The app handles an integration that is not connected.
- The integrations page gives a clear next step after connection or failure.
Add rate limits
Must: Protect AI and upload flows with request limits, as called out in issue #27. This is required before a public beta can absorb normal misuse.
- Repeated AI requests are limited per user or session.
- The user sees a limit error when quota is exceeded.
- Rate limit behavior is enforced server-side.
- Normal usage still succeeds without noticeable friction.
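A minimal sketch of the server-side enforcement required above, assuming a sliding-window policy. In production this would sit on the repo's Upstash stack (e.g. `@upstash/ratelimit`); here an in-memory map keeps the example self-contained, and the class and limit values are illustrative.

```typescript
// Sliding-window limiter: allow at most `limit` requests per `windowMs`
// per user or session id. Denied requests should map to an HTTP 429.
class SlidingWindowLimiter {
  private hits = new Map<string, number[]>();

  constructor(
    private limit: number,    // max requests per window
    private windowMs: number, // window length in milliseconds
  ) {}

  // Returns true if the request is allowed; `now` is injectable for tests.
  allow(id: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    const recent = (this.hits.get(id) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(id, recent);
      return false; // over quota: surface a limit error to the user
    }
    recent.push(now);
    this.hits.set(id, recent);
    return true;
  }
}
```

Keeping the limiter server-side (in a route handler or middleware) is what makes the "enforced server-side" criterion hold; a client-side check alone is advisory.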
Finish upload handling
Must: Stabilize document/reference uploads and fix the upload URL memory issue from issue #26. This prevents corrupted or stuck reference attachment flows.
- Upload URLs are not retained longer than needed.
- A failed upload leaves the editor in a recoverable state.
- Uploaded files can be attached to a document reference set.
- Memory growth from repeated upload attempts is addressed.
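One plausible shape for the "upload URLs are not retained longer than needed" criterion is to track every preview URL and revoke it as soon as the upload settles. This is a sketch under assumptions: issue #26's actual root cause may differ, and the create/revoke functions are injected so the pattern is testable outside a browser; in the app they would typically be `URL.createObjectURL` and `URL.revokeObjectURL`.

```typescript
// Track object URLs created for upload previews and revoke them on
// success, failure, or cancel, so repeated attempts cannot accumulate.
class UploadPreview {
  private active = new Set<string>();

  constructor(
    private create: (blob: unknown) => string,
    private revoke: (url: string) => void,
  ) {}

  start(blob: unknown): string {
    const url = this.create(blob);
    this.active.add(url);
    return url;
  }

  // Call from every terminal path of the upload, not just success.
  finish(url: string): void {
    if (this.active.delete(url)) this.revoke(url);
  }

  activeCount(): number {
    return this.active.size;
  }
}
```

Asserting `activeCount() === 0` after each upload attempt settles is one way to regression-test the memory-growth criterion.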
Polish home discovery
Should: Make the home page reliably expose new document creation, recent files, and search so users can re-enter the editor quickly. This is a retention feature, not a marketing page.
- The user can create a new document from home.
- Recent files are visible and usable.
- Search returns matching documents or an explicit empty state.
- Long document names render without truncation bugs.
V1 boundary and open questions
Out of v1
- Billing and metered usage plans
- Team workspaces and shared billing
- Public API for third-party apps
- Full offline support
- Real-time multiplayer editing
- Advanced admin analytics
- New integrations beyond GitHub, Linear, and Notion
- Release automation and semantic versioning until the first public beta is stable
Open questions
- What is the minimum data model for documents, references, and connected integrations?
- Should AI generation use a single provider, or a provider picker with Ollama as local fallback?
- What user identity model is required for a public beta: individual accounts only, or organizations too?
- Which events must be observed first: auth failures, AI request errors, uploads, or integration sync?
- How should reference selection behave when a source is disconnected or stale?
- Is the current hosted site on text0.dev meant to remain the primary distribution channel or only a demo?
- What is the smallest acceptable release gate given the missing CI, tests, deploy, and migration evidence?
Technical framing
- Keep the app on Next.js App Router and TypeScript; the repo already organizes product surfaces this way.
- Use the existing Vercel + Upstash + AI SDK stack for the editor and generation paths; this is already declared in the repo.
- Preserve Clerk for sign-in because the repo already shows an auth entry point and a Clerk stack signal in the README.
- Continue using Upstash Redis and Upstash Vector for lightweight state and reference retrieval until a stronger storage choice is evidenced.
- Keep UploadThing for file uploads because the README and issue list imply upload-based reference ingestion.
- Prefer local Ollama support for development and fallback testing; the latest commit explicitly adds it.
- Do not introduce a new backend framework in v1; keep server logic inside the existing Next.js surface until the architecture is clearer.
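The Ollama-as-fallback point above could reduce to a small selection function at the generation boundary. The environment variable names here are assumptions, not the repo's actual configuration, and "hosted" stands in for whichever AI SDK provider the app ships with.

```typescript
type Provider = "hosted" | "ollama";

// Prefer the hosted provider when a key is configured; allow an explicit
// local override, and fall back to Ollama for keyless dev environments.
function pickProvider(env: Record<string, string | undefined>): Provider {
  if (env.USE_OLLAMA === "1") return "ollama"; // explicit local override
  if (env.AI_API_KEY) return "hosted";        // hosted key present
  return "ollama";                            // local fallback for dev
}
```

Keeping this decision in one pure function keeps the "no new backend framework" constraint intact: the Next.js route handlers just ask it which provider to wire into the AI SDK call.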