Gemini Executive Synthesis

The core problem is the `boneyard-js` CLI crawler's inability to access authentication-protected routes, leading to incorrect skeleton generation. The proposed solution involves adding mechanisms to pass authentication context (cookies, session headers) to the Chromium session.

Technical Positioning
The tool aims to be usable for "any dashboard or internal tool where every route requires auth," which positions it for enterprise and internal application development where authentication is standard. The proposed API options (cookie file, auth headers, config file) point to a focus on developer flexibility and integration with existing auth patterns (JWT, session tokens, OAuth).
SaaS Insight & Market Implications
This issue exposes a critical functional gap in `boneyard-js`: it cannot operate within authenticated environments. The current implementation renders the tool "effectively unusable for any dashboard or internal tool," a significant market segment for skeleton loading frameworks. Developers need a robust mechanism to pass authentication context (cookies or authorization headers) to the underlying Chromium session, and the proposed API options show a clear demand for flexible integration with standard enterprise authentication patterns such as JWT, session tokens, and OAuth. Leaving this unaddressed severely limits adoption in secure application development, forcing developers either to bypass security manually or to abandon the tool, which undermines the product's viability for serious B2B use.
Proprietary Technical Taxonomy
CLI crawler, auth-protected routes, cookies, session headers, JWT, session token, OAuth, Chromium session

Raw Developer Origin & Technical Request

GitHub Issue · Apr 4, 2026
Repo: 0xGF/boneyard
CLI crawler fails on auth-protected routes — no way to pass cookies or session headers

## Problem

`npx boneyard-js build` spins up Chromium and visits the dev server, but
if the app is behind authentication (JWT cookie, session token, OAuth, etc.)
the crawler hits the login redirect and snapshots the login page instead of
the real component layout — generating useless or empty bones.

This makes boneyard-js effectively unusable for any dashboard or internal
tool where every route requires auth.

## Reproduction

1. Have a Next.js app with middleware protecting all routes under `/dashboard`
2. Wrap a dashboard component with ``
3. Run `npx boneyard-js build localhost/dashboard/stats`
4. CLI visits the URL, middleware redirects to `/login`, bones are generated
from the login page DOM instead
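
The middleware in step 1 could look like the following minimal Next.js sketch (a hypothetical repro file, assuming a `session` cookie gates `/dashboard`; not taken from the issue):

```js
// middleware.js — hypothetical auth middleware for the repro
import { NextResponse } from 'next/server';

export function middleware(request) {
  // No session cookie → redirect to /login, which is exactly
  // the page the boneyard crawler ends up snapshotting.
  if (!request.cookies.get('session')) {
    return NextResponse.redirect(new URL('/login', request.url));
  }
  return NextResponse.next();
}

// Only guard routes under /dashboard
export const config = { matcher: '/dashboard/:path*' };
```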

## Expected behavior

The CLI should provide a way to pass auth context to the Chromium session
so it can reach protected pages and snapshot the real component layout.
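
Either option maps onto standard browser-automation APIs. A minimal sketch, assuming the crawler uses Playwright (the issue does not show boneyard's actual internals, and `snapshotWithAuth` is a hypothetical name):

```js
const { chromium } = require('playwright');

// Sketch: apply cookies and headers to the Chromium context
// before crawling, so middleware sees an authenticated request.
async function snapshotWithAuth(url, cookies, headers) {
  const browser = await chromium.launch();
  const context = await browser.newContext({ extraHTTPHeaders: headers });
  await context.addCookies(cookies); // e.g. the session cookie from Option A
  const page = await context.newPage();
  await page.goto(url); // no redirect to /login this time
  const html = await page.content(); // real component layout, not the login page
  await browser.close();
  return html;
}
```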

## Proposed API

**Option A — Cookie file (most flexible):**
```bash
npx boneyard-js build --cookies ./boneyard-cookies.json
```
Where the JSON is a Playwright/Puppeteer-compatible cookies array:
```json
[{ "name": "session", "value": "abc123", "domain": "localhost", "path": "/" }]
```

**Option B — Auth headers:**
```bash
npx boneyard-js build --header "Authorization=Bearer "
```
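
For Option B, repeated `--header` flags would need to be folded into a single headers object. A sketch (`parseHeaderFlags` is hypothetical; it splits on the first `=` only, so token values containing `=` survive):

```javascript
// Hypothetical helper: turn ['Name=Value', ...] CLI flags into the
// { Name: 'Value' } shape browser-automation APIs expect for headers.
function parseHeaderFlags(flags) {
  const headers = {};
  for (const flag of flags) {
    const idx = flag.indexOf('=');
    if (idx < 1) {
      throw new Error(`invalid header "${flag}", expected Name=Value`);
    }
    // Split on the first '=' only; the value may itself contain '='.
    headers[flag.slice(0, idx)] = flag.slice(idx + 1);
  }
  return headers;
}
```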

**Option C — boneyard.config.js:**
```js
export default {
  auth: {
    cookies: [{ name: 'session', value: process.env.SESSION_TOKEN }],
    headers: { Author...
```

Developer Debate & Comments

No active discussions extracted for this entry yet.

Engagement Signals

Replies: 0
Issue Status: open
