Why It (Didn’t) Work: What marketers can learn from America’s ‘Scared Straight’ initiative
- Richard Shotton
- Jul 7
- 4 min read
Published in Marketing Week, July 2025.

Marketers can have a tendency to see research as a definitive answer, but, as was the case with the Scared Straight programme, sometimes it only confirms a bias.
In 1970s America, juvenile delinquency was on the rise: arrests among 14- to 17-year-olds were climbing and gangs were busily recruiting. Something had to be done.
In the face of rising youth offending, lawmakers came up with a new idea: to scare young people off crime. These so-called ‘Scared Straight’ programmes involved taking busloads of troublemaking teens to visit prisons, where they would witness the discomforts, deprivations and indignities of prison life.
The first prison to lead such a programme was Rahway, now known as East Jersey State Prison. There would be talks from the inmates, who would share horrifying experiences of prison assault, beatings and murder. The kids were sometimes forced into intense confrontations with criminals, facing verbal abuse — and sometimes even physical intimidation — until they broke down in tears.
The aim was to put juveniles off committing the kind of offences that could land them there for real.
The programme became famous as the subject of a documentary, narrated by Columbo actor Peter Falk, called ‘Scared Straight!’, which won the 1978 Oscar for Best Documentary Feature, as well as two Emmys.
Questionably effective
If you believed the documentary, Scared Straight — also known as a Juvenile Awareness Program — worked exceptionally well. At the end of the film, we learn that 8,000 young people had been through the programme and an impressive 80% of them had stayed out of trouble.
Following the first programme’s apparent success, the Scared Straight idea took off, with officials convinced it would help to turn around young people headed down the wrong path. For at least another 20 years after the first youth prison visit, Juvenile Awareness Programs were adopted across the US and around the world, including in Australia, the UK, Norway, Germany and Canada.
Based on a compelling story of success, people copied the example again and again. However, as Matthew Syed points out in his excellent book ‘Black Box Thinking’, there was one problem with the intervention.
It didn’t work.
According to him, a more fitting label would have been ‘Scared Crooked’.
How can that be? Wasn’t there a decline in offending among those who took part?
The problem was that Scared Straight didn’t record what would have happened if there had been no intervention. A simple reduction in offending among participants is not enough to declare the programme a success. As teenagers mature, many naturally settle down and turn away from a life of crime. For the programme to be genuinely effective, it needs to decrease offending more than if the participants hadn’t taken part.
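To make that concrete, here is a minimal sketch in Python with purely hypothetical numbers (they are not the real Scared Straight figures): a programme can look impressive on its own results while still performing worse than the no-intervention baseline.

```python
# Hypothetical figures only, not the real Scared Straight data.
# The point: a headline success rate means nothing without a control baseline.

participant_offending_rate = 0.20   # hypothetical: 20% of attendees go on to offend
control_offending_rate = 0.15       # hypothetical: 15% of comparable non-attendees offend anyway

# Viewed in isolation, this sounds like a triumph...
print(f"Participants staying out of trouble: {1 - participant_offending_rate:.0%}")

# ...but the programme only 'works' if it beats what would have happened anyway.
lift = control_offending_rate - participant_offending_rate
if lift > 0:
    print(f"Programme reduced offending by {lift:.0%} versus control")
else:
    print(f"Programme increased offending by {-lift:.0%} versus control")
```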
The real result
The first researcher to measure the genuine impact of the programme by comparing it to non-intervention was James Finckenauer of Rutgers University. In 1982, he randomised teens into two groups: some were admitted to the Scared Straight programme, others were kept out. His conclusion? The reduction in crime was far less than the 80% touted by the documentary and, most damningly, less than among the control group.
His finding isn’t a one-off. A 2014 meta-analysis led by Anthony Petrosino, senior fellow at George Mason University, looked at nine studies that followed a randomised design and included a total of 946 kids. It found that youths were between one and five times more likely to commit a crime if they’d taken part in a Juvenile Awareness Program.
The evidence is now conclusive: these programmes are harmful rather than helpful, as involvement increases crime and delinquency. The brutalising experience of the visit encouraged kids to commit more crime, perhaps by normalising criminal behaviour, or as a reaction against the attempt to manipulate them.
What can we learn from Scared Straight?
Scared Straight is an extreme example: a lauded success that turned out to be a mirage. But it offers salutary lessons for marketers.
First, just because something feels like it should work doesn’t mean that it will. There was an assumption that a prison visit would shock kids off their path to crime.
But preconceptions can be dangerous. People have a nasty habit of treating evidence differently depending on whether it confirms or challenges gut feelings. Supporting evidence is accepted uncritically whereas contradictory evidence is held to a higher standard.
As marketers we need to be aware of this tendency and actively challenge our preconceptions. In the words of the philosopher Bertrand Russell, “In all affairs it’s a healthy thing now and then to hang a question mark on the things you have long taken for granted.”
But the second lesson from Scared Straight highlights an even more pernicious problem: often, bad research is worse than no research. At least if we haven’t conducted research, we retain a degree of humility. After all, we know that we don’t know. So we proceed with caution.
But bad research not only sends us off in the wrong direction with false insights, it also pumps us full of unjustified confidence. We continue along the mistaken path even longer because we’re so sure we’re right.
Here there’s a strong marketing parallel. Too many decisions are based on bad research, most obviously the claims made in focus groups and surveys.
These are misleading. As David Ogilvy said, “The trouble with market research is that people don’t think what they feel, they don’t say what they think, and they don’t do what they say.”
If you uncritically accepted focus groups' claims, you’d think that people were rational calculating machines weighing up every decision in excruciating detail. And assumptions like that lead to ineffective advertising.
The truth about Scared Straight didn’t come to light until the programme was tested more scrupulously. Finckenauer didn’t ask people whether the programme worked; he ran a rigorous test-versus-control experiment.
And that’s what marketers must do more of to uncover what’s working. Less listening to what people say influences them and more experiments uncovering what actually does.
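For marketers who want to see what that looks like in practice, here is a minimal sketch in Python. Every number in it is invented for illustration: randomly hold out part of the audience, expose the rest to the campaign, then compare outcomes between the two groups rather than asking either group what persuaded them.

```python
# Illustrative test-versus-control readout; all figures are invented.
from math import sqrt
from statistics import NormalDist

# Hypothetical campaign results
exposed_buyers, exposed_n = 560, 10_000   # randomly assigned to see the campaign
holdout_buyers, holdout_n = 500, 10_000   # randomly held out (the control group)

p1 = exposed_buyers / exposed_n
p2 = holdout_buyers / holdout_n

# Two-proportion z-test: is the difference bigger than chance alone would produce?
pooled = (exposed_buyers + holdout_buyers) / (exposed_n + holdout_n)
se = sqrt(pooled * (1 - pooled) * (1 / exposed_n + 1 / holdout_n))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Exposed conversion: {p1:.2%}, holdout conversion: {p2:.2%}")
print(f"Lift: {p1 - p2:+.2%} points, z = {z:.2f}, p = {p_value:.3f}")
```

The statistics matter less than the principle: like Finckenauer, you judge the campaign against what would have happened without it.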