How it works

Here's exactly what your snail does.

Give snail a job. It connects to your tools, watches for patterns, tests ideas, and brings you a weekly brief with what it found and what it did. Here's how.

STEP 01

Create a Snail

Give your snail a name and a job — "increase my website conversion rate" or "boost SEO and organic web traffic."

Each snail focuses on one thing. Want to monitor sales AND code quality? Create two snails. They're $29/mo total, not per snail.

New snail
Name: sales-snail
Purpose: Monitor web traffic and run A/B testing to improve conversion rates.
STEP 02

Connect Your Data

snail plugs into the tools you already use: GitHub, Vercel, Gmail, CRM, analytics, or anything with webhooks.

OAuth for the big platforms, custom webhooks for everything else. Your snail will watch, propose changes, and auto-deploy once approved.

Integrations
GitHub · connected
Vercel · connected
Gmail · connected
Plausible · pending
STEP 03

Crawls

Your snail automatically runs "crawls" on a daily cadence, analyzing and sorting data. With its job in mind, it proactively proposes changes to achieve its goal.

Mission · Cloud Spend
Objective: Reduce Vercel + AWS spend by 20%
Last crawl: 2h ago · 14 sources
Status: Analyzing
STEP 04

Trails & Signals

After each crawl, your snail leaves a trail — a summary of what it found and what it thinks it means.

When something important comes up, it raises a signal — flagged by severity so you know what needs attention now versus what can wait.

Recent signals
critical · Bounce rate +42% on /pricing since deploy
warning · 3 dependencies with CVEs ≥ high
info · Email open rate trending down 8%
STEP 05

Act on it

Every signal comes with a recommended next step. "Bounce rate jumped 42% after the last deploy — here's a fix." Review it, approve it, move on.

Proposed action
Type: Pull request · fix hero CLS
Est. impact: Bounce rate -30%

Slow and steady

Too many tools focus on speed, when slow and steady wins the race.

snails don't race to provide an instant insight. They patiently analyze data, run tests, and propose changes only when the data supports them. They're thorough — cross-referencing more sources and spending more time on analysis instead of trying to answer you in two seconds.

Slow doesn't mean lazy. It means deliberate. A snail that spends 20 minutes analyzing a week of data and surfaces one thing you need to act on is worth more than a chatbot that answers instantly and tells you nothing new.

“Slow and steady wins the race.” That's our motto, and the reason snails are so effective.

Infrastructure

Powered by Courier.

snail runs on Courier — private AI inference optimized for Apple Silicon that can be hosted on your own Mac Mini or Mac Studio.

Courier offers flat-rate cloud APIs as well as the option to self-host for unlimited usage and total privacy.

Learn about Courier

Put a snail on it.

Join the waitlist. We'll email you when your snail is ready.

Questions? jackson@thinkrecursion.ai