
Building an internal web app means expressing your organization’s unique processes in code and restricting access to employees only. Especially when you automate workflows with generative-AI services or a data warehouse, self-hosting with a JavaScript framework lets you meet requirements that generic SaaS cannot.
## Why Internal-App Frameworks Are in the Spotlight
Two big trends drive attention toward JavaScript frameworks for internal apps:
### 1. AI-Powered Business Apps
Large language models (LLMs) offer huge potential for everyday automation. Off-the-shelf SaaS treats company-specific workflows as exceptions, so solutions rarely cover every corner of daily operations. Custom AI apps, by contrast, can encode complex logic—tasks that once required human judgment—directly in code.
When you need an AI business app, writing your own code pays off: it enables deep customization and reaches exactly the corners where manual processes still dominate.
### 2. BI as Code
BI as Code is the movement to manage business-intelligence dashboards in source control instead of low-code SaaS editors. Why switch from low-code to full code?
A BI dashboard relies on three layers:
- Data connectors – link data sources to the BI system
- Data modeling – clean and reshape raw schemas
- Data presentation – display the modeled data
Low-code tools handle presentation, but connectors and modeling still involve SQL or DML. Maintenance can balloon—especially as data volumes explode—so many teams find that managing everything as code actually reduces cost and friction.
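The modeling layer in particular benefits from living in source control, because it can be reviewed and tested like any other code. As a rough illustration (the schema and field names below are hypothetical), the daily-sales rollup a dashboard needs can be expressed as an ordinary, reviewable function:

```typescript
// A minimal sketch of the "data modeling" layer as code: reshape raw order
// rows into a daily-sales model. Table and field names are hypothetical.
interface RawOrder {
  order_time: string;   // ISO timestamp from the warehouse
  total_amount: number; // order total
}

interface DailySales {
  day: string;   // YYYY-MM-DD
  sales: number; // summed total_amount for that day
}

function modelDailySales(orders: RawOrder[]): DailySales[] {
  const byDay = new Map<string, number>();
  for (const o of orders) {
    const day = o.order_time.slice(0, 10); // truncate timestamp to date
    byDay.set(day, (byDay.get(day) ?? 0) + o.total_amount);
  }
  // Sort ascending by day so downstream charts render in order
  return [...byDay.entries()]
    .map(([day, sales]) => ({ day, sales }))
    .sort((a, b) => a.day.localeCompare(b.day));
}
```

A transformation like this gets unit tests and pull-request review for free, which is exactly the maintenance win the BI-as-Code argument rests on.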
## Next.js + Vercel AI SDK: A Full-Stack Powerhouse for AI Apps
### Overview
Next.js is the well-known full-stack React framework. It unifies front-end and back-end in a single codebase and offers server-side rendering (SSR) for SEO-friendly sites.
But SSR also means you can run Node.js on the server—perfect for AI apps. By wrapping calls to an AI provider inside an API route, you keep secret keys off the browser. The Vercel AI SDK builds on this strength, unifying API differences across OpenAI, Anthropic, Google, and others, and providing React hooks for common front-end patterns—especially the tricky business of streaming responses.
### Minimal implementation

- API route that streams from an LLM
- Front end that calls the route via `useChat`
```ts
// 1. API route
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({ model: openai('gpt-4o'), messages });
  return result.toDataStreamResponse();
}
```
```tsx
// 2. Client component
'use client';

import { useChat } from '@ai-sdk/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map(m => (
        <div key={m.id} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.parts.map((p, i) =>
            p.type === 'text' ? <div key={`${m.id}-${i}`}>{p.text}</div> : null
          )}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          className="fixed dark:bg-zinc-900 bottom-0 w-full max-w-md p-2 mb-8 border border-zinc-300 dark:border-zinc-800 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  );
}
```
### Benefits for Internal Use
- One repo, one team: UI and API live together, so domain experts can touch both without strict front-end/back-end silos, speeding reviews and delivery.
- Secrets stay server-side: AI keys, SSO tokens, and internal APIs are safely hidden within server components or API routes.
- Cost optimization: Mix ISR for low-change reports with Edge Functions for latency-sensitive chat—performance and cost in a single codebase.
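For the second point, a minimal sketch of the pattern: keep the provider key in a server-only helper so it is read from the environment at request time and never enters the client bundle. The variable name `OPENAI_API_KEY` is the AI SDK's default; adapt it to your provider.

```typescript
// Sketch: read the provider key on the server only. Because this helper is
// called from an API route (never from a 'use client' component), the key
// never ships in the browser bundle.
function getOpenAIKey(): string {
  const key = process.env.OPENAI_API_KEY;
  if (!key) {
    // Fail fast at request time instead of sending an unauthenticated call
    throw new Error('OPENAI_API_KEY is not set in the server environment');
  }
  return key;
}
```

The same shape works for SSO tokens or internal-API credentials: one server-side accessor per secret, with an explicit failure when the environment is misconfigured.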
### Caveats & Selection Tips
The killer feature is consolidating code and infrastructure—plus safe LLM streaming. Compared with Evidence and Observable (which focus on data visualization), Next.js shines when you need business logic, auth, AI inference, and a sophisticated UI in one app.
If you deploy on self-managed infra rather than Vercel, you’ll need custom build settings.
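For container-based self-hosting, one common starting point is Next.js's `standalone` output mode, which bundles the server and its production dependencies so the app starts with a plain `node server.js`. A minimal sketch (your infrastructure may need more, such as env handling or a reverse proxy):

```typescript
// next.config.ts — minimal self-hosting sketch.
// 'standalone' copies the server and only the node_modules it actually
// needs into .next/standalone, which is convenient for slim Docker images.
const nextConfig = {
  output: 'standalone',
};

export default nextConfig;
```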
## Evidence: A “BI as Code” Framework Built with SQL × Markdown
### Overview
Evidence is an open-source framework that lets you assemble data apps using nothing but SQL and Markdown. Inline a SQL query in a `*.md` file, pipe the result into components such as `<LineChart>` or `<DataTable>`, and you have a dashboard. Pages support SSG and SSR and live in Git, so BI as Code fits naturally into dev workflows.
Official connectors cover BigQuery, Snowflake, PostgreSQL, and more, plus a VS Code extension for analysts.
Because prose, SQL, and visualization coexist in one file, even non-engineers can follow the logic. Reference one query's results from another with `${ }` interpolation to chain queries, ETL steps included.
### Example
````markdown
# Daily Sales (This Month)

```sql sales_by_day
SELECT
  DATE_TRUNC('day', order_time) AS day,
  SUM(total_amount) AS sales
FROM warehouse.orders
WHERE order_time >= DATE_TRUNC('month', CURRENT_DATE)
GROUP BY 1
ORDER BY 1
```

Total sales: **${formatCurrency(total_sales)}**
````
### Benefits

- Spin up fast: `npx create-evidence@latest` gets you running in minutes; deploy to Vercel, Netlify, or Cloudflare Pages.
- No front-end expertise required: Analysts who know SQL can ship dashboards without React or Svelte.
### Caveats
Queries run at build time, so millisecond-level live KPIs aren’t ideal. If you need free-form UI, use Next.js with your favorite chart library; if you need rapid, SQL-driven reports, Evidence is the simplest path.
## Observable: Notebook-Style Development for Interactive Data Apps
### Overview
Observable began as an interactive notebook, and with Observable Framework (OSS) it now lets you build data apps in code. Combine JavaScript/TypeScript with SQL, Python, or R loaders; cells recompute automatically via a dependency graph. Sites are statically built for high speed and deploy easily to Vercel or Netlify.
### Example

```js
// cells/notebook.js — Framework ≥1.13
import * as Plot from "@observablehq/plot";
import { sql } from "@observablehq/framework/sql";

viewof metric = Inputs.select(["sales", "orders"], {label: "Metric"})

data = sql`
  SELECT DATE_TRUNC('day', order_time) AS day,
         SUM(${metric}) AS value
  FROM warehouse.orders
  WHERE order_time >= DATE_TRUNC('month', CURRENT_DATE)
  GROUP BY 1
  ORDER BY 1
`

chart = Plot.line(data, {x: "day", y: "value"})
```
Switching `metric` triggers all dependent cells and instantly updates the chart. This declarative reactivity pairs well with LLM-guided exploration.
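The mechanism behind this can be pictured as a small dependency graph. The following is a toy TypeScript sketch of the idea, not Observable's actual runtime: setting a source value recomputes every cell that declares it as an input (assuming an acyclic graph).

```typescript
// Toy sketch of cell-based reactivity: derived cells declare their inputs,
// and changing a source value recomputes every dependent cell.
type Cell = {
  deps: string[];
  compute: (get: (name: string) => unknown) => unknown;
};

class CellGraph {
  private cells = new Map<string, Cell>();
  private values = new Map<string, unknown>();

  // A source cell holding a plain value (like an input widget's selection)
  set(name: string, value: unknown): void {
    this.values.set(name, value);
    this.invalidate(name);
  }

  // A derived cell that reads other cells through `get`
  define(name: string, deps: string[], compute: Cell['compute']): void {
    this.cells.set(name, { deps, compute });
    this.invalidate(name);
  }

  get(name: string): unknown {
    return this.values.get(name);
  }

  // Recompute `name` (if derived) and, transitively, its dependents.
  // Assumes the graph is acyclic.
  private invalidate(name: string): void {
    const cell = this.cells.get(name);
    if (cell) {
      this.values.set(name, cell.compute((n) => this.values.get(n)));
    }
    for (const [other, c] of this.cells) {
      if (other !== name && c.deps.includes(name)) this.invalidate(other);
    }
  }
}
```

In Observable the graph is built for you from the cell definitions; the sketch only shows why a single `set` on a source cell is enough to refresh a chart downstream.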
### Benefits
- Exploration becomes documentation: Start in a notebook, then deploy as a static site—analysis notes become the company’s data portal.
- Advanced viz out of the box: D3 and Observable Plot make interactive visualization (including LLM outputs) straightforward.
- DuckDB-Wasm support: For small-to-medium data, run entirely in the browser—no back-end ops.
### Caveats
- Large dependency graphs can hurt readability; impose directory structure and naming rules.
- After Observable Cloud’s sunset, you’ll need other services for scheduled builds or secrets.
- If you need RBAC or comprehensive user management, Next.js or another full-stack framework offers more features.
## Final thoughts

| Framework | Key Strengths | Typical Use Cases |
|---|---|---|
| Next.js + Vercel AI SDK | Unified full-stack + LLM workflow; easy auth/RBAC hooks | Internal AI workflows with sensitive data; complex business apps |
| Evidence | Dashboards from SQL × Markdown; optimized for GitOps | Standardized reports, exec KPIs, large multi-page BI sites |
| Observable | Cell-based reactive UI + rich visualizations; great for exploration | Data-exploration portals; rapid experimentation & sharing of AI results |
Pick the framework that matches your team’s skills and goals to balance development speed and operational cost.