CAPTCHA tokens expire: reCAPTCHA tokens last roughly 90–120 seconds, and Cloudflare Turnstile tokens around 300 seconds. Redis's native TTL operations make it a natural fit for caching tokens until just before they expire, maintaining pools of pre-solved tokens, and deduplicating in-flight solve requests.
Token Lifecycle in Redis
```
Solve Request
      │
      ▼
Check Redis ── cache hit ──► Return cached token
      │
  cache miss
      ▼
CaptchaAI API
      │
      ▼
Store in Redis (TTL = token_lifetime - safety_margin)
      │
      ▼
Return token
```
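The branch in the diagram reduces to a small decision function. This is a hypothetical helper (the names `next_action` and `min_remaining` are not from the implementation below); the 10-second threshold mirrors the freshness check used in the cache code later in this guide.

```python
def next_action(cached_token, ttl_remaining, min_remaining=10):
    """Decide the lifecycle branch: serve from cache or fall through to the API.

    cached_token: token popped from Redis, or None on a cache miss.
    ttl_remaining: seconds left on the cache key (Redis TTL command).
    """
    if cached_token is not None and ttl_remaining > min_remaining:
        return ("return_cached", cached_token)
    return ("solve_via_api", None)
```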
Python Implementation
Connection and Configuration
```python
import os
import time
import redis
import requests

r = redis.Redis(
    host=os.environ.get("REDIS_HOST", "localhost"),
    port=int(os.environ.get("REDIS_PORT", 6379)),
    db=0,
    decode_responses=True,
)

API_KEY = os.environ["CAPTCHAAI_API_KEY"]

# TTLs with safety margin (seconds before actual expiration)
TOKEN_TTLS = {
    "recaptcha_v2": 80,   # actual ~120s, cache for 80s
    "recaptcha_v3": 80,
    "hcaptcha": 80,
    "turnstile": 250,     # actual ~300s, cache for 250s
}
```
Token Cache Operations
```python
def cache_key(sitekey, pageurl):
    """Generate Redis key for a specific CAPTCHA target."""
    return f"captcha:token:{sitekey}:{pageurl}"

def get_cached_token(sitekey, pageurl):
    """Pop a cached token from the queue if the key is still fresh."""
    key = cache_key(sitekey, pageurl)
    # Check TTL before popping: LPOP on a single-item list deletes the key,
    # after which TTL reports -2 and a valid token would be discarded.
    if r.ttl(key) > 10:  # at least 10 seconds remaining
        return r.lpop(key)
    return None

def cache_token(sitekey, pageurl, token, captcha_type="recaptcha_v2"):
    """Push a solved token to the cache with the appropriate TTL."""
    key = cache_key(sitekey, pageurl)
    ttl = TOKEN_TTLS.get(captcha_type, 80)
    r.rpush(key, token)
    # Caveat: EXPIRE refreshes the TTL for the whole list, so older tokens
    # already in the queue can appear fresher than they really are.
    r.expire(key, ttl)
```
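Because the list shares one TTL, each new RPUSH extends the apparent lifetime of every queued token. A sorted set that stores each token's absolute expiry as its score avoids that. This is a sketch, not part of the implementation above: the `captcha:ztoken:*` key name is hypothetical, and the functions take any redis-py client as `r`.

```python
import time

def ztoken_key(sitekey, pageurl):
    # Hypothetical key name, alongside the captcha:token:* pattern above.
    return f"captcha:ztoken:{sitekey}:{pageurl}"

def zcache_token(r, sitekey, pageurl, token, lifetime=80):
    """Store a token with its absolute expiry timestamp as the score."""
    r.zadd(ztoken_key(sitekey, pageurl), {token: time.time() + lifetime})

def zget_token(r, sitekey, pageurl, min_remaining=10):
    """Pop the freshest token that still has min_remaining seconds left."""
    key = ztoken_key(sitekey, pageurl)
    # Drop anything expiring within the safety window, then take the
    # token with the latest expiry.
    r.zremrangebyscore(key, "-inf", time.time() + min_remaining)
    popped = r.zpopmax(key)  # [(member, score)] or []
    return popped[0][0] if popped else None
```

Each token now ages out individually, regardless of how often the key is written.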
Solve with Cache
```python
def solve_recaptcha(sitekey, pageurl, captcha_type="recaptcha_v2"):
    """Solve reCAPTCHA with a Redis cache check and in-flight dedup."""
    # 1. Check cache
    cached = get_cached_token(sitekey, pageurl)
    if cached:
        return {"solution": cached, "source": "cache"}

    # 2. Check if a solve is already in progress (dedup)
    lock_key = f"captcha:lock:{sitekey}:{pageurl}"
    if not r.set(lock_key, "1", nx=True, ex=120):
        # Another worker is solving — wait for its result
        for _ in range(60):
            time.sleep(2)
            cached = get_cached_token(sitekey, pageurl)
            if cached:
                return {"solution": cached, "source": "cache_wait"}
        return {"error": "TIMEOUT_WAITING_FOR_OTHER_WORKER"}

    try:
        # 3. Solve via CaptchaAI
        resp = requests.post("https://ocr.captchaai.com/in.php", data={
            "key": API_KEY,
            "method": "userrecaptcha",
            "googlekey": sitekey,
            "pageurl": pageurl,
            "json": 1,
        })
        data = resp.json()
        if data.get("status") != 1:
            return {"error": data.get("request")}

        captcha_id = data["request"]
        for _ in range(60):
            time.sleep(5)
            result = requests.get("https://ocr.captchaai.com/res.php", params={
                "key": API_KEY, "action": "get",
                "id": captcha_id, "json": 1,
            }).json()
            if result.get("status") == 1:
                token = result["request"]
                # Caution: reCAPTCHA tokens are single-use. Caching a copy
                # here lets a waiting worker pop it, but then only one of
                # the two consumers will succeed on the target site.
                cache_token(sitekey, pageurl, token, captcha_type)
                return {"solution": token, "source": "api"}
            if result.get("request") != "CAPCHA_NOT_READY":
                return {"error": result.get("request")}
        return {"error": "TIMEOUT"}
    finally:
        r.delete(lock_key)
```
Token Pool (Pre-Solving)
Maintain a pool of ready tokens for high-throughput targets:
```python
import threading

class TokenPool:
    """Keep a Redis-backed pool of pre-solved tokens topped up in the background."""

    def __init__(self, sitekey, pageurl, pool_size=5, captcha_type="recaptcha_v2"):
        self.sitekey = sitekey
        self.pageurl = pageurl
        self.pool_size = pool_size
        self.captcha_type = captcha_type
        self.pool_key = f"captcha:pool:{sitekey}:{pageurl}"
        self._running = False

    def start(self):
        self._running = True
        thread = threading.Thread(target=self._refill_loop, daemon=True)
        thread.start()

    def stop(self):
        self._running = False

    def _refill_loop(self):
        while self._running:
            current = r.llen(self.pool_key)
            if current < self.pool_size:
                self._solve_and_add()
            time.sleep(2)

    def _solve_and_add(self):
        resp = requests.post("https://ocr.captchaai.com/in.php", data={
            "key": API_KEY,
            "method": "userrecaptcha",
            "googlekey": self.sitekey,
            "pageurl": self.pageurl,
            "json": 1,
        })
        data = resp.json()
        if data.get("status") != 1:
            return
        captcha_id = data["request"]
        for _ in range(60):
            time.sleep(5)
            result = requests.get("https://ocr.captchaai.com/res.php", params={
                "key": API_KEY, "action": "get",
                "id": captcha_id, "json": 1,
            }).json()
            if result.get("status") == 1:
                ttl = TOKEN_TTLS.get(self.captcha_type, 80)
                r.rpush(self.pool_key, result["request"])
                r.expire(self.pool_key, ttl)
                return
            if result.get("request") != "CAPCHA_NOT_READY":
                return

    def get_token(self):
        return r.lpop(self.pool_key)

# Usage
pool = TokenPool("6Le-wvkSAAAAAPBMRTvw0Q4Muexq9bi0DJwx_mJ-", "https://example.com")
pool.start()

# When you need a token:
token = pool.get_token()
```
JavaScript Implementation
```javascript
const Redis = require("ioredis");
const axios = require("axios");

const redis = new Redis(process.env.REDIS_URL || "redis://localhost:6379");
const API_KEY = process.env.CAPTCHAAI_API_KEY;

const TOKEN_TTLS = { recaptcha_v2: 80, recaptcha_v3: 80, hcaptcha: 80, turnstile: 250 };

function cacheKey(sitekey, pageurl) {
  return `captcha:token:${sitekey}:${pageurl}`;
}

async function getCachedToken(sitekey, pageurl) {
  const key = cacheKey(sitekey, pageurl);
  // Check TTL before popping: LPOP on a single-item list deletes the key,
  // after which TTL reports -2 and a valid token would be discarded.
  const ttl = await redis.ttl(key);
  if (ttl > 10) return redis.lpop(key);
  return null;
}

async function solveWithCache(sitekey, pageurl, type = "recaptcha_v2") {
  // Check cache
  const cached = await getCachedToken(sitekey, pageurl);
  if (cached) return { solution: cached, source: "cache" };

  // Dedup lock
  const lockKey = `captcha:lock:${sitekey}:${pageurl}`;
  const locked = await redis.set(lockKey, "1", "EX", 120, "NX");
  if (!locked) {
    for (let i = 0; i < 60; i++) {
      await new Promise((resolve) => setTimeout(resolve, 2000));
      const waitCached = await getCachedToken(sitekey, pageurl);
      if (waitCached) return { solution: waitCached, source: "cache_wait" };
    }
    return { error: "TIMEOUT_WAITING" };
  }

  try {
    const submit = await axios.post("https://ocr.captchaai.com/in.php", null, {
      params: { key: API_KEY, method: "userrecaptcha", googlekey: sitekey, pageurl, json: 1 },
    });
    if (submit.data.status !== 1) return { error: submit.data.request };

    const captchaId = submit.data.request;
    for (let i = 0; i < 60; i++) {
      await new Promise((resolve) => setTimeout(resolve, 5000));
      const poll = await axios.get("https://ocr.captchaai.com/res.php", {
        params: { key: API_KEY, action: "get", id: captchaId, json: 1 },
      });
      if (poll.data.status === 1) {
        const key = cacheKey(sitekey, pageurl);
        const ttl = TOKEN_TTLS[type] || 80;
        await redis.rpush(key, poll.data.request);
        await redis.expire(key, ttl);
        return { solution: poll.data.request, source: "api" };
      }
      if (poll.data.request !== "CAPCHA_NOT_READY") return { error: poll.data.request };
    }
    return { error: "TIMEOUT" };
  } finally {
    await redis.del(lockKey);
  }
}
```
Redis Key Design
| Key Pattern | Purpose | TTL |
|---|---|---|
| `captcha:token:{sitekey}:{pageurl}` | Cached solved tokens | 80–250s (per type) |
| `captcha:lock:{sitekey}:{pageurl}` | Dedup lock for in-flight solves | 120s |
| `captcha:pool:{sitekey}:{pageurl}` | Pre-solved token pool | 80–250s |
| `captcha:stats:{date}` | Daily solve counters | 7 days |
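Since every key in this scheme is supposed to carry a TTL, it is worth auditing periodically. A small sketch (the `audit_captcha_keys` helper is hypothetical; it takes any redis-py client) walks the keyspace with SCAN and flags persistent keys:

```python
def audit_captcha_keys(r, pattern="captcha:*"):
    """Return captcha keys that exist but have no expiry set.

    Redis TTL semantics: -1 means the key exists with no TTL,
    -2 means the key does not exist.
    """
    missing = []
    for key in r.scan_iter(match=pattern, count=100):
        if r.ttl(key) == -1:
            missing.append(key)
    return missing
```

SCAN iterates incrementally, so this is safe to run against a live instance, unlike `KEYS`.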
Monitoring Redis Cache Performance
```python
def cache_stats():
    """Report Redis keyspace stats (server-wide, not scoped to captcha keys)."""
    info = r.info("stats")
    hits = info.get("keyspace_hits", 0)
    misses = info.get("keyspace_misses", 0)
    total = hits + misses
    return {
        "hit_rate": f"{hits / total * 100:.1f}%" if total else "0%",
        "hits": hits,
        "misses": misses,
        "active_keys": r.dbsize(),
    }
```
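The keyspace counters above are server-wide, so for application-level numbers you can populate the `captcha:stats:{date}` counters from the key-design table yourself. A sketch (the `record_solve` helper is hypothetical; it takes any redis-py client) uses one hash per day with the 7-day TTL from the table:

```python
import datetime

STATS_TTL = 7 * 24 * 3600  # keep daily counters for 7 days, per the key table

def record_solve(r, source, when=None):
    """Increment today's counter for a solve source ('cache', 'cache_wait', 'api')."""
    day = (when or datetime.date.today()).isoformat()
    key = f"captcha:stats:{day}"
    r.hincrby(key, source, 1)
    r.expire(key, STATS_TTL)
    return key
```

Call it wherever a solve result is returned, tagged with the result's `source` field, and `HGETALL captcha:stats:<date>` gives a per-day cache-versus-API breakdown.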
Troubleshooting
| Issue | Cause | Fix |
|---|---|---|
| Cached token rejected by target site | Token expired before use | Reduce TTL safety margin or use tokens immediately |
| Lock never released | Worker crashed during solve | TTL on lock key auto-cleans (120s) |
| Token pool always empty | Solve time exceeds refill rate | Increase pool size or add more refill threads |
| Redis memory growing | Keys written without TTLs | Ensure every key gets a TTL; audit with SCAN + TTL, and use `redis-cli --bigkeys` to find heavy keys |
FAQ
Should I cache CAPTCHA tokens?
Only for high-throughput scenarios targeting the same sitekey. For single solves, caching adds complexity without benefit. Pre-solved token pools shine when you need sub-second CAPTCHA responses.
What's the right TTL safety margin?
Subtract 30–40 seconds from the actual token lifetime. reCAPTCHA tokens last ~120 seconds, so cache for 80. This gives your application 40 seconds to consume the token.
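That arithmetic fits in one helper. The lifetimes below are approximate vendor figures, and `safe_ttl` is a hypothetical name, not part of the implementation above:

```python
# Approximate advertised token lifetimes in seconds (assumptions, not exact)
TOKEN_LIFETIMES = {"recaptcha_v2": 120, "recaptcha_v3": 120, "hcaptcha": 120, "turnstile": 300}

def safe_ttl(captcha_type, margin=40, floor=10):
    """Cache TTL = advertised token lifetime minus a consumption margin."""
    lifetime = TOKEN_LIFETIMES.get(captcha_type, 120)
    return max(lifetime - margin, floor)
```

For example, `safe_ttl("recaptcha_v2")` yields the 80-second TTL used throughout this guide, and `safe_ttl("turnstile", margin=50)` yields 250.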
Can I share Redis cache across multiple workers?
Yes — that's the primary use case. All workers check and populate the same cache. The dedup lock prevents multiple workers from solving the same CAPTCHA simultaneously.
Next Steps
Speed up your CAPTCHA pipeline with Redis-backed token caching — get your CaptchaAI API key.