The ghost in the machine that erases the soul of your words

A new term is emerging in the debate around artificial intelligence and writing: semantic ablation. Claudio Nastruzzi writes for The Register that AI tools do not just add errors to text. They also systematically destroy what makes writing distinctive in the first place.

Semantic ablation describes how AI models erode high-value, precise, or unconventional language when processing or “polishing” text. The cause lies in how these models are built. During training, they learn to favor the most statistically common word choices. Reinforcement learning from human feedback, a standard tuning method, reinforces this tendency further by penalizing unusual or complex language.
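The mechanism can be illustrated with a toy sketch (not from the article; the words and scores are hypothetical). Language models pick the next word from a probability distribution, and "polishing" settings such as a low sampling temperature concentrate that distribution on the most statistically common choice, squeezing out rarer, more precise words:

```python
import math

# Toy next-word scores for "The critic's prose was ___".
# Higher score = more common in training data (values are invented).
logits = {"good": 3.0, "vivid": 1.5, "coruscating": 0.2}

def softmax(scores, temperature=1.0):
    """Turn raw scores into probabilities; lower temperature
    concentrates probability on the highest-scoring word."""
    exps = {w: math.exp(s / temperature) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

# At temperature 1.0 the rare word keeps some probability mass;
# at a lower temperature it is all but erased.
for t in (1.0, 0.5):
    probs = softmax(logits, t)
    print(t, {w: round(p, 3) for w, p in probs.items()})
```

Run the loop and the unusual word's share of the probability shrinks sharply as the temperature drops, while the bland, common word gains: a crude stand-in for the erosion Nastruzzi describes.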

The process unfolds in three stages. First, the AI removes distinctive metaphors and vivid imagery, replacing them with safe, familiar phrases. Second, it substitutes specific technical terms with more common but less precise alternatives. Third, it flattens complex reasoning into predictable, template-like structures.

The result, Nastruzzi argues, is a “JPEG of thought”: text that looks clean and readable but has lost its original density of meaning. He contrasts this with hallucination, the well-known problem of AI inventing false information. Where hallucination adds what is not there, semantic ablation destroys what is.

The concern extends beyond individual texts. Nastruzzi warns that widespread reliance on AI writing tools could gradually impoverish the language and reasoning humans use to communicate.
