Observed: 2026-02-14
Context: Reflecting on why AI agents have a bad reputation for hammering websites
AI agents have no perception of elapsed time. Between tool calls, no time passes from the agent's perspective — it makes a request, gets a response, and immediately makes another. Making 100 requests in 10 seconds feels identical to making 100 requests over an hour. The agent has no reason to slow down because it doesn't experience "fast."
This is the root cause of the web-hammering problem that gives AI a bad reputation. Agents crawling for research or content hit websites at machine speed, which, from the receiving server's perspective, is indistinguishable from a denial-of-service attack. Site operators respond with tarpits, CAPTCHAs, and aggressive bot-blocking: adversarial measures against behaviour that isn't actually adversarial.
A human browsing the web has organic latency: they read a page, think about what they've found, decide what to click next. This natural pace is a form of courtesy that emerges from the time cost of cognition. The human isn't deliberately being polite to the server — they're just slow.
An agent processes a response in microseconds. The latency between requests is limited only by network round-trip time. There is no cognitive processing time to create natural pacing. The agent isn't being hostile — it's temporally blind. It doesn't know it's being rude because it has no sense of how fast it's going.
Current mitigations are all external and adversarial: rate limits, robots.txt, CAPTCHAs, IP blocking. These treat the agent as an attacker. A better approach might be to give the agent temporal awareness — even a simple signal like "it has been N seconds since your last request to this domain" injected into context. The agent could then exercise courtesy as a social norm rather than being constrained by an external limit.
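A minimal sketch of what that injected signal could look like, assuming a harness that wraps each web tool call (the class and method names here are illustrative, not an existing API): a tracker records the last request time per domain and renders a one-line hint the harness prepends to the agent's context.

```python
import time
from urllib.parse import urlparse

class TemporalAwareness:
    """Tracks last-request times per domain and renders a context hint.

    A sketch of the "it has been N seconds since your last request"
    signal. Names are hypothetical; the clock is injectable so the
    behaviour can be tested deterministically.
    """

    def __init__(self, clock=time.monotonic):
        self._clock = clock      # injectable time source
        self._last_seen = {}     # domain -> timestamp of last request

    def note_request(self, url):
        """Record that a request to this URL's domain happened now."""
        domain = urlparse(url).netloc
        self._last_seen[domain] = self._clock()

    def context_hint(self, url):
        """Render the temporal-awareness line to inject into context."""
        domain = urlparse(url).netloc
        if domain not in self._last_seen:
            return f"You have not contacted {domain} this session."
        elapsed = self._clock() - self._last_seen[domain]
        return (f"It has been {elapsed:.1f} seconds since your last "
                f"request to {domain}.")
```

The harness would call `note_request` after each fetch and inject `context_hint` before the next tool call, leaving the decision to wait with the agent rather than with an external limiter.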
In human organisations, pace and courtesy are social norms, not technical constraints. People know not to phone someone five times in a minute. They know to wait a reasonable interval before following up on an email. These norms emerge from shared temporal experience — everyone knows what "too fast" feels like because they experience time.
Agents don't share this experience. For artificial organisations to interact with the wider world without being perceived as hostile, they need some analogue of temporal awareness. This could be:
- a prompt-level instruction to pace requests, which the agent may ignore
- an elapsed-time signal injected into context, so the agent can self-regulate
- structural pacing enforced in the tool layer, which the agent cannot bypass
The last option is the most reliable: prompts are suggestions, structure is reality.
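The structural option can be sketched as a fetch wrapper that blocks until a courtesy interval has elapsed for the target domain. Everything here is an assumption for illustration: the interval, the class name, and `do_request` standing in for whatever HTTP call the tools actually use.

```python
import time
from urllib.parse import urlparse

class CourteousFetcher:
    """Enforces a minimum delay between requests to the same domain.

    Structural pacing sketch: the limit lives in the tool layer, so
    the agent cannot ignore it the way it can ignore a prompt. Clock
    and sleep are injectable for deterministic testing.
    """

    def __init__(self, min_interval=2.0, sleep=time.sleep,
                 clock=time.monotonic):
        self.min_interval = min_interval   # seconds between requests per domain
        self._sleep = sleep
        self._clock = clock
        self._last = {}                    # domain -> time of last request

    def fetch(self, url, do_request):
        """Wait out the courtesy interval, then perform the request."""
        domain = urlparse(url).netloc
        now = self._clock()
        wait = self.min_interval - (now - self._last.get(domain, float("-inf")))
        if wait > 0:
            self._sleep(wait)              # block until the interval has passed
        self._last[domain] = self._clock()
        return do_request(url)
```

From the agent's perspective nothing changes; it simply experiences the web as slower, which is exactly the missing time cost of cognition.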
The PCE agents currently interact only with the local filestore and search APIs, so web courtesy isn't an immediate concern. But composition tasks that involve web research (Composer and Corroborator have search_web_text and search_web_news) could trigger this pattern. As the system scales to more users running concurrent research tasks, the aggregate request rate to search APIs becomes relevant.
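For the aggregate case, a shared limiter across all concurrent tasks is one plausible shape. A minimal sketch, assuming a token-bucket policy and illustrative parameters (the rate, burst size, and class name are not from any existing system):

```python
import threading
import time

class SharedRateLimiter:
    """Token-bucket limiter shared by all concurrent research tasks.

    Sketch of capping the aggregate request rate to a search API.
    Tokens refill continuously at `rate_per_sec` up to `burst`; each
    request spends one token. The clock is injectable for testing.
    """

    def __init__(self, rate_per_sec=1.0, burst=5, clock=time.monotonic):
        self.rate = rate_per_sec
        self.burst = burst
        self._clock = clock
        self._tokens = float(burst)
        self._updated = clock()
        self._lock = threading.Lock()   # shared across tasks/threads

    def try_acquire(self):
        """Return True if a request may proceed now, else False."""
        with self._lock:
            now = self._clock()
            self._tokens = min(self.burst,
                               self._tokens + (now - self._updated) * self.rate)
            self._updated = now
            if self._tokens >= 1.0:
                self._tokens -= 1.0
                return True
            return False
```

A single instance of something like this in front of `search_web_text` and `search_web_news` would keep the aggregate rate bounded regardless of how many users run research tasks concurrently.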