Reddit trending → Claude 1-line take
Public Reddit JSON + claude -p — no quoting hell, no SDK, no Python. The most useful 'morning briefing' one-liner.
Setup
- `claude /login` (Claude subscription) OR `export ANTHROPIC_API_KEY=sk-…`
Cost per run
<$0.01
The one-liner
$ curl -s -A "oneliner101/1.0" \
  "https://www.reddit.com/r/MachineLearning/top.json?limit=25&t=day" \
  | jq -r '.data.children[].data | "- [\(.score)] \(.title)"' \
  | claude -p \
    "Three bullets: dominant theme, surprising claim, sentiment. No preamble."

What each stage does
- [01] curl: `curl -s -A "oneliner101/1.0" …` fetches Reddit's public JSON with a custom User-Agent (mandatory: Reddit blocks the default curl UA).
- [02] jq: `jq -r '.data.children[].data | …'` flattens each post to one `[score] title` line, keeping the prompt input tight.
- [03] claude: `claude -p "Three bullets …"` runs in headless print mode (`-p`). It reads stdin as additional context; the prompt itself is the flag's argument.
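The jq stage is the easy one to check offline. A minimal sketch, using a made-up two-post sample in the shape Reddit's listing endpoint returns, so you can see the flattened lines without hitting the network:

```shell
# Hypothetical sample of Reddit's listing JSON, piped through the same filter.
echo '{"data":{"children":[
  {"data":{"score":412,"title":"New RAG benchmark released"}},
  {"data":{"score":98,"title":"Ask: best rerankers in 2024?"}}
]}}' | jq -r '.data.children[].data | "- [\(.score)] \(.title)"'
# → - [412] New RAG benchmark released
# → - [98] Ask: best rerankers in 2024?
```

Each post becomes one short line, so 25 posts cost only a few hundred prompt tokens.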
Expected output (sample)
- Theme: agentic RAG with hybrid retrieval (BM25 + dense + rerankers) is the dominant frame.
- Surprising: a working paper claims vanilla vector search is now the worst-performing retriever on long-context benchmarks.
- Sentiment: builder-skeptical; the community is tired of demoware, hungry for evals.
Caveats & tips
- Do NOT add `--bare` unless you have ANTHROPIC_API_KEY. `--bare` ignores OAuth and demands the key.
- For very fast batches, swap `claude` for `llm -m groq-…` (300+ tok/s on Llama 3.3 70B).
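To reuse the pipeline across subreddits, wrap it in a small shell function. A sketch only: the `briefing` name, its arguments, and the defaults are assumptions layered on top of the one-liner, not part of it.

```shell
# Hypothetical wrapper: same pipeline, parameterized by subreddit and time window.
# Usage: briefing rust week
briefing() {
  sub="${1:-MachineLearning}"   # subreddit, default matches the one-liner
  window="${2:-day}"            # hour|day|week|month|year|all
  curl -s -A "oneliner101/1.0" \
    "https://www.reddit.com/r/${sub}/top.json?limit=25&t=${window}" \
    | jq -r '.data.children[].data | "- [\(.score)] \(.title)"' \
    | claude -p \
      "Three bullets: dominant theme, surprising claim, sentiment. No preamble."
}
```

Drop it in `.bashrc` or `.zshrc` and the morning briefing becomes two words.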