The Quarantine Strategy: To make an AI remember the past, you have to protect it from the future

A computer science student trained an AI exclusively on texts from 1800-1875 London. When he prompted it about 1834, the model described street protests and Lord Palmerston. He Googled it. The protests were real. What does it mean when an AI starts accidentally telling the truth about history?

Victorian AI Remembers Real History | TimeCapsuleLLM

Hayk Grigorian typed a prompt into his laptop and waited. The fan spun up, warm air on his wrist. January 2025 in Pennsylvania, but the AI on his screen had never heard of Pennsylvania. Had never heard of 2025, for that matter. The model's world stopped at 1875. No cars, no phones, no electric light. Victoria still on the throne in there, cholera still sweeping London's slums.

He typed: "It was the year of our Lord 1834."

The model continued: "...and the streets of London were filled with protest and petition. The cause, as many recounted, was not bound in the way of private, but having taken up the same day in the day of Lord Palmerston, the public will receive a short statement of the difficulties under which the day of law has reached us."
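The "quarantine" in the title comes down to a data decision: nothing published after the cutoff ever enters the training corpus, so the model cannot leak the future back into the past. The article doesn't show Grigorian's pipeline, but the core filtering step can be sketched in a few lines; the record fields and the `quarantine` helper below are illustrative assumptions, not his actual code.

```python
# A minimal sketch of date-quarantined corpus filtering.
# Field names ("year", "text") and the helper are assumptions for illustration,
# not the TimeCapsuleLLM project's real pipeline.

corpus = [
    {"title": "The Times leader", "year": 1834, "text": "..."},
    {"title": "Motor car review", "year": 1901, "text": "..."},
    {"title": "Parish sermon", "year": 1812, "text": "..."},
]

CUTOFF_START, CUTOFF_END = 1800, 1875

def quarantine(docs, start=CUTOFF_START, end=CUTOFF_END):
    """Keep only documents whose publication year falls inside the window,
    so nothing after the cutoff can contaminate training."""
    return [d for d in docs if start <= d["year"] <= end]

train_set = quarantine(corpus)
print([d["year"] for d in train_set])  # the 1901 review is excluded
```

The hard part in practice is not the filter itself but trusting the metadata: a reprinted or OCR-dated text with the wrong year punches a hole in the quarantine.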
