MORNING COFFEE

Clawdbot just broke the internet—and it runs entirely on your laptop.

The open-source project from Austrian developer Peter Steinberger exploded across tech communities this week, with MacStories editor Federico Viticci burning through 180 million tokens in seven days. The project's Discord server gained thousands of members in weeks. Security researchers at Hudson Rock published an urgent analysis on Sunday warning that info-stealers are already targeting Clawdbot installations.

The architecture is deceptively simple. Clawdbot is an LLM-powered agent that runs locally on your computer, connects to any messaging platform you already use—WhatsApp, Telegram, iMessage, Slack, Discord—and executes real actions on your behalf. It stores everything as folders and Markdown files on your filesystem. No cloud database. No subscription lock-in. Your data stays on your machine.

"Clawdbot has fundamentally altered my perspective of what it means to have an intelligent, personal AI assistant in 2026," Viticci wrote in his review. "It's an AI nerd's dream come true."

Three signals explain why this matters:

  1. Self-improvement is the killer feature — Viticci asked Clawdbot to give itself image generation capabilities. It researched the Google Nano Banana Pro API, created the integration, and prompted him to securely store credentials in macOS Keychain—without human guidance. The agent builds its own skills.

  2. Subscription economics are inverting — Viticci replaced Zapier automations with cron jobs Clawdbot built on his Mac mini. No monthly fee. No cloud dependency. Just shell access and API calls the agent assembled itself. When a $20/month Claude subscription can eliminate $50/month in automation tools, the maths gets interesting.

  3. The security model is terrifying — Hudson Rock's analysis found that Clawdbot stores API tokens, VPN credentials, and "cognitive context" (user memories, work patterns, personal anxieties) in plaintext Markdown files. One infostealer infection exposes everything. The privacy tradeoff is real.

The project runs on Claude Opus 4.5 by default but supports any major model. Steinberger—known for selling his startup PSPDFKit to Insight Partners—built it as a personal tool before open-sourcing the codebase. Community members are now contributing skills, MCP servers, and platform integrations at a rapid pace.

For context: Clawdbot is to ChatGPT what Obsidian is to Google Docs. It's local-first, infinitely customisable, and puts the user in complete control—with all the complexity and risk that implies.

The question everyone's asking: When consumer LLMs become this malleable by default, what happens to the app economy?

GROWTH HACK

The "Congress Tracker" Engine

Copy the trades of power—and build an audience doing it.

The Play: Politicians file financial disclosures with 45-day delays. By the time the public sees them, the trades are old news. This system scrapes, parses, and delivers the top 10 most notable trades daily—turning regulatory filings into a signal stream your audience actually wants.

Why This Works: STOCK Act disclosures are public but buried. The data is scattered across multiple sources, filed in inconsistent formats, and nearly impossible to parse at scale. Anyone who surfaces this information cleanly becomes the trusted source. And everyone wants to know what Nancy Pelosi bought.

The Implementation Stack:

| Category | Tool | Notes |
| --- | --- | --- |
| Scraping | Playwright + Browserbase | Handles CAPTCHA-protected gov sites |
| Parsing | GPT-4 via API | Extracts structured trade data from PDFs |
| Storage | Supabase | PostgreSQL with real-time subscriptions |
| Delivery | Beehiiv + GPT | Auto-generates daily briefings |
| Extension | Chrome MV3 | Surfaces trades in real time while browsing |

Step 1: Browserbase (The Scraper)

// scrape_disclosures.js
const { chromium } = require('playwright');

async function scrapeDisclosures() {
  // Connect to the remote Browserbase session over CDP
  const browser = await chromium.connectOverCDP(process.env.BROWSERBASE_URL);
  try {
    const page = await browser.newPage();
    await page.goto('https://disclosures-clerk.house.gov/');
    // Selectors are illustrative; adjust to the live page markup
    const filings = await page.$$eval('.filing-row', rows =>
      rows.map(r => ({
        name: r.querySelector('.name').innerText,
        date: r.querySelector('.date').innerText,
        pdf_url: r.querySelector('a').href
      }))
    );
    return filings;
  } finally {
    await browser.close();
  }
}

Step 2: GPT-4 (The Parser)

// parse_disclosure.js
const OpenAI = require('openai');
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

const prompt = `Extract all stock trades from this disclosure:
- Ticker symbol
- Transaction type (buy/sell)
- Approximate amount range
- Transaction date
Return as a JSON object with a "trades" array.`;

const completion = await openai.chat.completions.create({
  model: "gpt-4-turbo", // JSON mode requires a turbo-generation model
  messages: [{ role: "user", content: prompt + pdfText }],
  response_format: { type: "json_object" }
});

const trades = JSON.parse(completion.choices[0].message.content).trades;

Step 3: Supabase (The Database)

CREATE TABLE trades (
  id SERIAL PRIMARY KEY,
  politician TEXT,
  ticker TEXT,
  type TEXT,
  amount_range TEXT,
  trade_date DATE,
  disclosure_date DATE,
  created_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX idx_trades_ticker ON trades(ticker);
CREATE INDEX idx_trades_politician ON trades(politician);

Step 4: Daily Signal Generation

// generate_daily.js
// The Supabase client resolves to { data, error }, so destructure the rows
const { data: topTrades } = await supabase
  .from('trades')
  .select('*')
  .gte('disclosure_date', yesterday)
  .order('amount_range', { ascending: false })
  .limit(10);

const newsletter = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{
    role: "user",
    content: `Write a 300-word briefing on these trades: ${JSON.stringify(topTrades)}`
  }]
});

Step 5: Chrome Extension (The Surface)

Build an MV3 extension that highlights any ticker on any webpage with politician trading activity. When users see $NVDA mentioned, they instantly see: "Rep. X bought $500K–$1M on Jan 15."
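
A minimal sketch of the content-script detection layer, assuming a hypothetical set of known tickers loaded from the pipeline's database (the regex, function names, and tooltip wiring are illustrative, not a finished extension):

```javascript
// Find cashtags like $NVDA in page text and keep only tickers
// that appear in the congressional trades dataset.
const TICKER_RE = /\$([A-Z]{1,5})\b/g;

function findTickers(text, knownTickers) {
  const hits = new Set();
  for (const match of text.matchAll(TICKER_RE)) {
    if (knownTickers.has(match[1])) hits.add(match[1]);
  }
  return [...hits];
}

// Example: only $NVDA is in the dataset, so $XYZQ is ignored.
findTickers("Loading up on $NVDA while $XYZQ dips", new Set(["NVDA"]));
// → ["NVDA"]
```

Each hit would then be decorated with a tooltip ("Rep. X bought $500K–$1M on Jan 15") fetched from the trades API.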

The Result:

  • Audience growth: Congressional trading content gets 10x engagement versus standard market analysis

  • Trust building: You become the transparency source—not just another finance commentator

  • Monetisation: Premium tiers for real-time alerts, API access for quant funds, sponsorship from fintech apps

One raw government PDF. One automated pipeline. One audience that grows itself.

DAILY STAT

64% of Media Executives Say Back-End AI Is "Very Important"

The Number: 64%

Unit/Timeframe: Media executives rating back-end automation (tagging, copyediting, transcription) as "very important" to their companies — 2025 survey

The Scale: The Reuters Institute surveyed 266 media leaders across 51 countries. Nearly two-thirds now consider AI-powered back-office operations critical infrastructure—not optional tooling.

The Human Comparison: In 2024, only 60% said the same. A 4-point jump in one year signals a shift from experimentation to dependency.

The Shift: Media companies aren't betting on AI to write articles—they're betting on AI to do everything except write articles. Tagging, transcription, copyediting, metadata generation. The invisible work that eats editorial budgets is being systematically automated.

The Economics: Here's what else moved: Coding and product development jumped from 28% to 44% (+57% YoY). Commercial uses like propensity-to-pay models climbed from 29% to 33%. Meanwhile, content creation with human oversight actually dropped from 30% to 29%. The message: executives trust AI for operations, not output.

What This Means for Solopreneurs:

  • The invisible work pays first — Don't pitch AI content creation. Pitch AI transcription, tagging, SEO optimisation. That's where budgets are moving.

  • Build the picks and shovels — Agencies that offer AI-powered back-end audits (transcription quality, metadata completeness, tagging accuracy) have an immediate market.

  • The 44% coding signal — Nearly half of media execs now see AI-assisted development as critical. Sell tools that help non-technical teams ship faster.

  • Human oversight premium — Content creation AI isn't winning trust. Position yourself as the human-in-the-loop that makes AI output publishable.

The winners in 2026 aren't replacing journalists. They're replacing the operations that distract journalists from journalism.

TOOL TIP

Databricks Mosaic AI — Enterprise AI infrastructure for production-grade agents

What It Does: Mosaic AI is Databricks' unified platform for building, deploying, and governing AI agents and models. It combines foundation model serving, vector search, agent frameworks, and AI gateway governance into a single stack that sits on your existing data lakehouse. Think of it as the full MLOps pipeline—from training to deployment to monitoring—without stitching together 15 different tools.

Pricing:

| Tier | Price | Limits | Use Case |
| --- | --- | --- | --- |
| Pay-per-token | $0.001–$0.01/1K tokens | No commitment | Experimentation, prototyping |
| Provisioned Throughput | ~$0.07–$0.65/DBU | Reserved capacity | Production workloads |
| Premium Workspace | ~$0.55/DBU | RBAC, audit logs, governance | Enterprise teams |
| Enterprise Workspace | ~$0.65+/DBU | HIPAA, FedRAMP compliance | Regulated industries |

Who It's For:

  • Data teams with existing Databricks — Native integration means zero migration friction

  • Enterprise ML engineers — Production-grade serving with auto-scaling and governance

  • Compliance-heavy organisations — Built-in guardrails, audit logging, and data lineage

  • Teams building RAG pipelines — Vector Search + Agent Framework handles the full stack

What Makes It Different:

  • Unified governance — AI Gateway provides rate limiting, fallbacks, traffic splitting, and guardrails across any model (OpenAI, Anthropic, Meta, custom)

  • Any model, one API — Query GPT-4, Claude, Llama, or your fine-tuned model through identical endpoints

  • Data-native — Your agents can query Unity Catalog tables directly—no ETL to a separate vector database

  • Agent evaluation built-in — AI judges measure output quality so you can iterate before production

Core Capabilities:

  • Foundation Model Serving (GPT-4, Claude, Llama, Gemini)

  • Mosaic AI Vector Search with hybrid retrieval

  • Agent Framework for RAG and multi-step agents

  • Model Training for fine-tuning on proprietary data

  • AI Gateway for governance, monitoring, and cost control

Limitations:

  • DBU pricing is complex—costs can spike unexpectedly without monitoring

  • Requires Databricks ecosystem commitment; not a standalone tool

  • Enterprise features (HIPAA, FedRAMP) require highest-tier pricing

The Verdict: If you're already on Databricks and need to ship AI agents to production with enterprise governance, Mosaic AI is the obvious choice. If you're a startup exploring, the pay-per-token experimentation tier keeps costs low while you validate use cases.

TICKER WATCH

Archer Aviation (NYSE: ACHR) — $8.19

The Numbers That Matter:

| Metric | Value |
| --- | --- |
| Current Price | $8.19 |
| 52-Week Low | $5.48 |
| 52-Week High | $14.62 |
| Market Cap | $6.0B |
| Analyst Target | $4.50–$18.00 (avg $11.61) |
| Cash Position | $1.64B |

What They Do (Simple Version): Archer builds electric air taxis—aircraft that take off and land vertically like helicopters but run on batteries and use 12 rotors instead of one. The Midnight aircraft carries four passengers on 10-20 minute flights over urban traffic. Think Uber, but you skip the motorway entirely.

Why This Matters: Archer just bought Hawthorne Airport—80 acres near LAX and SoFi Stadium—for $126 million. They're the Official Air Taxi Provider for the 2028 Los Angeles Olympics. The FAA's new eVTOL Integration Pilot Programme could accelerate certification. And Morgan Stanley analysts project the "low altitude economy" could reach $9 trillion.

Three things changed: (1) Hawthorne Airport acquisition gives them a permanent LA hub 3 miles from LAX, (2) Olympics partnership delivers global visibility and a hard deadline for commercial operations, (3) defence contracts with Anduril create a secondary revenue stream while waiting for FAA certification.

The Upside Case:

  • Conservative: $11.61 target = +42% from here

  • Bull case: $18.00 (Goldman Sachs) = +120% from here

  • Profitability: Pre-revenue; burning ~$130M/quarter; expect first revenue in 2026

Simple Maths: $1,000 invested today could become $1,420 (conservative) or $2,200 (bull case) within 12–18 months if FAA certification proceeds on schedule.

The Maths: Archer has $6 billion in backlog orders from United Airlines, Abu Dhabi Aviation, Ethiopian Airlines, and others. The Midnight aircraft is designed for 20-mile trips at roughly $3-5/mile—competitive with premium rideshare on a time-adjusted basis. At scale (500+ aircraft), unit economics become attractive. But "at scale" requires FAA type certification that analysts don't expect until 2027-2028. Every quarter of delay burns another $130M in cash.

The Risks (Be Honest):

  • FAA certification timeline remains uncertain—no firm date

  • Pre-revenue company burning $500M+ annually

  • Will likely need to raise more cash (dilution risk)

  • eVTOL technology is unproven at commercial scale

  • Competition from Joby, Lilium, and others intensifying

The Verdict: Archer is a binary bet on the future of urban aviation. The Olympics deadline creates a forcing function—they need to fly paying passengers by summer 2028 or the partnership becomes embarrassing. The Hawthorne acquisition and defence contracts show management isn't just waiting for certification; they're building infrastructure and diversifying revenue. But this is still a pre-revenue company trading on a story, not cash flows.

Position: Speculative. 1–2% max. High risk—this is venture capital in public markets.

Not financial advice. Do your own research.

WORKFLOW

Clawdbot Personal Automation Pipeline

Setup Time: 45 minutes | Weekly Value: Replace $50–$100/month in automation subscriptions with local cron jobs

Description: Build your own self-improving AI assistant that runs locally, connects to your existing messaging apps, and can create new capabilities on demand—no cloud dependency required.

Architecture:

Trigger: Telegram/WhatsApp message to Clawdbot
    |
Action 1: Gateway — Routes message to local Claude agent
    |
Action 2: Agent — Processes request, checks memory files, selects tools
    |
Action 3: Shell/MCP — Executes actions (API calls, file edits, cron jobs)
    |
Action 4: Memory — Logs interaction to daily Markdown file
    |
Outcome: Response delivered via original messaging platform

Step 1: Install Clawdbot (The Foundation)

# Clone and install
git clone https://github.com/clawdbot/clawdbot.git
cd clawdbot
pnpm install

# Run setup wizard
pnpm clawdbot wizard

# Start the gateway daemon
pnpm clawdbot gateway start

Signal: Gateway running locally on port 18789.

Step 2: Connect Telegram (The Interface)

# Link Telegram account
pnpm clawdbot channels login

# Configure allowlist (who can message your bot)
# Edit ~/.clawdbot/config.yaml
channels:
  telegram:
    allowFrom:
      - "+1555012345"  # Your phone number

Signal: You can now message your assistant from Telegram on any device.

Step 3: Create a Daily Briefing Skill

<!-- ~/clawd/skills/morning-briefing/SKILL.md -->
# Morning Briefing Skill

## Trigger
Cron: 0 7 * * * (7 AM daily)

## Actions
1. Fetch today's calendar events via Google Calendar API
2. Check Todoist for due tasks
3. Summarise top news from RSS feeds
4. Generate audio summary via ElevenLabs TTS
5. Send to Telegram with artwork from Nano Banana

## Output Format
- Text summary (3-5 bullet points)
- Audio file attachment
- AI-generated header image

Step 4: Set Up Cron Job

# Ask Clawdbot to create the cron job
# (In Telegram, send this message to your bot)

"Create a cron job that runs my morning-briefing skill 
every day at 7 AM. Include calendar, Todoist tasks, 
and an audio summary of my day ahead."

# Clawdbot will:
# 1. Create the skill file
# 2. Register the cron trigger
# 3. Test the workflow
# 4. Confirm via Telegram

Expansion Ideas:

  • Add RSS monitoring for industry news (replace Feedly)

  • Create invoice processing workflow (replace Zapier)

  • Build automated social posting from content calendar

  • Set up smart home controls via Philips Hue/HomeKit MCP servers

From $50/month in automation tools to a single Mac mini running in your cupboard. Your data, your rules, your agent.

THE BOTTOM LINE

Clawdbot isn't just another AI wrapper—it's a signal that the agent architecture wars are moving from data centres to desktop computers. When a developer can burn through 180 million tokens in a week building a personal assistant that replaces paid subscriptions and runs entirely on a Mac mini, the economics of cloud-first AI start to look shakier.

The media industry gets it. Sixty-four percent of executives now consider back-end AI "very important"—not for content creation, but for the invisible work that drains budgets. Same energy applies to personal productivity: the value isn't in AI writing your emails, it's in AI managing everything around your emails so you can focus on the work that matters.

The playbook: Start local, stay modular, own your data. Whether you're building Clawdbot skills, congressional disclosure trackers, or speculative positions in companies like Archer that are betting on physical-world automation, the pattern is the same: infrastructure that puts execution in your hands, not a platform's.

If you're building, look at what Clawdbot users are replacing—Zapier, IFTTT, even basic app functionality—and consider where your product fits. If you're investing, watch for the certification milestones (FAA for Archer, model capabilities for AI companies) that separate story from substance. If you're selling, lead with what runs locally—privacy and control are the new premium.

Ship daily.

HackrLife Daily is read by growth marketers at Google, Adobe, LinkedIn, and creators building the future.
