Hotel 626 was “fun” until the phone rang

A couple of weeks ago, I was sitting on the couch with my son, letting YouTube autoplay its way into the stranger corners of the internet, when we landed on a video from the channel Visual Venture titled “Kids Games Too Disturbing For Kids”, which included a piece on a sinister game from 2008 called Hotel 626.
He laughed at the grainy graphics and theatrical jump scares. Don’t judge him; he was born in 2013 and never experienced graphics the way we did. I sat there thinking that this was the kind of lead magnet marketers could only dream of. The campaign, focused on bringing two flavours back from the dead, reportedly cost around $500,000… a hefty budget for a snack promotion, but in hindsight, they got a lot of data for their money.

The game that played you back
Hotel 626 was a Doritos promotion that ironically didn’t feature chips at all. Instead, you “checked in” by handing over your name, email, and date of birth. The game then prompted you for webcam and microphone access, pitched as “making the experience scarier”. And in its final moments, if you were in the United States, you could enter your phone number to receive a live call with whispered instructions on how to escape. I mean, W*&… that would have given me many sleepless nights.
The 2009 sequel, Asylum 626 (yes, they did it again), went even further. This time, you could log in with Facebook or Twitter, and the game would weave your social graph into the horror, even posting content on your behalf. You read that correctly. “The more access you grant”, the experience promised, “the more intense the game”.
One section of the 2011 FTC complaint against Doritos’ parent company, PepsiCo/Frito-Lay (the complaint is no longer available online), described this approach bluntly:
“The site encourages teens to provide personal information and access to their social networks in order to enhance gameplay.”
From a marketing standpoint, it was precisely what the campaign playbook preached at the time: “Make the experience personal by harnessing the potential of media and technology.” And they did. Brilliantly, and maybe a little too well.
Activation without guardrails
Looking back at it now, it’s hard to overstate how much was packed into that one campaign. It captured identifiers, behavioural data, device information, and, in the sequel, social graph data. It blurred the line between entertainment and surveillance, all wrapped in the thrill of a late-night horror story.
The previously mentioned FTC complaint also noted:
“Visitors are not provided with clear notice of how their information will be used, nor is parental consent obtained for those under age.”
Back then, this was framed as imaginative targeting, stating the obvious: “really understanding your target audience and creating imaginative content is a critical element in any marketing campaign.” And the campaign did just that, often by leaning on persuasion triggers that Robert Cialdini would recognise instantly: scarcity in the form of a 6 pm–6 am play window, commitment once you’d already shared some personal details, and even a dose of authority when the game told you what to do next. It was textbook influence, just wrapped in a haunted hotel lobby.
Then versus Now
Back in 2008, Hotel 626 only opened its doors between 6 pm and 6 am, asking for your name, email, date of birth, and sometimes your phone number, without specifying how long this information would be kept. Today, that would mean explicit consent, a clear purpose, and a disclosed retention period. Well, if you are doing it right.
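To make that concrete, here is a minimal sketch, in TypeScript with entirely hypothetical names, of what tying collected data to an explicit purpose and a disclosed retention period could look like. Treat it as an illustration of the principle, not a compliance recipe.

```ts
// Illustrative only: the shape and field names are assumptions, not any real campaign's code.
interface ConsentRecord {
  subjectId: string;                                    // pseudonymous ID, not the raw email address
  purpose: "game_personalisation" | "marketing_email";  // one record per stated purpose
  grantedAt: Date;
  expiresAt: Date;                                      // the retention period disclosed at check-in
  withdrawable: true;                                   // revoking must be as easy as granting
}

function grantConsent(
  subjectId: string,
  purpose: ConsentRecord["purpose"],
  retentionDays: number
): ConsentRecord {
  const grantedAt = new Date();
  const expiresAt = new Date(grantedAt.getTime() + retentionDays * 24 * 60 * 60 * 1000);
  return { subjectId, purpose, grantedAt, expiresAt, withdrawable: true };
}
```

The point is simply that every identifier you capture carries a purpose and an expiry date you committed to at the moment of collection.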
Webcam and microphone access came bundled into the experience. Remember, we are talking about an online game. Now, those would need to be optional, requested only when necessary, and explained in the moment.
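For comparison, a just-in-time permission request in the browser might look something like this sketch (TypeScript; requestScareCam and explainToPlayer are hypothetical names). The standard getUserMedia call only fires after the player has been told, in context, why the camera is wanted, and saying no leaves the game playable.

```ts
// Hedged sketch: ask for the camera only at the moment it is needed, and only after
// an in-context explanation. Nothing here runs silently on page load.
async function requestScareCam(
  explainToPlayer: () => Promise<boolean>   // shows the "here's why we want your camera" dialog
): Promise<MediaStream | null> {
  const playerAgreed = await explainToPlayer();
  if (!playerAgreed) return null;           // the experience must still work without it

  try {
    // Standard Web API; no audio requested because this feature doesn't need it.
    return await navigator.mediaDevices.getUserMedia({ video: true, audio: false });
  } catch {
    return null;                            // a browser-level denial is a valid answer, not an error
  }
}
```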
The sequel’s Facebook and Twitter logins gave posting permissions and access to your social graph with little warning. It wasn’t exactly a Cambridge Analytica moment, but you could see how the same mechanics, data pulled in under the guise of fun, could become a problem in less playful hands. Now, surprise posting is a non-starter, and social access needs a separate, transparent opt-in.
Its teen appeal came with no parental consent flow. Today, COPPA in the US and GDPR Article 8 in the EU demand age screening and verifiable parental consent before collecting personal data from younger players.
Even the “more access makes it scarier” mechanic, celebrated in 2009, would now be seen as a dark pattern. And that mysterious phone call? These days you’d have to tick a box, read a privacy notice, and confirm your number twice. By the time it rang, the only thing left to fear would be missing your next meeting or, if you’re still living at home, your mother calling you to dinner for the fifth time.
Which leaves us with the bigger question:
If every twist and turn in a modern activation has to survive a compliance obstacle course, how far can we really take it before the spark that made it memorable burns out?
The edges haven’t disappeared
What kept me thinking after watching the video was how familiar the instincts still are. Brands still push the edges of what’s permissible. They’ve just traded the overt creepiness of a phone ringing in the dark for quieter, more pervasive tactics. Now the intrusion might take the form of an ad appearing in your feed right after you’ve had a private conversation about the topic. Most platforms deny “listening in,” yet whether it’s device microphones, aggressive contextual targeting, or probabilistic identity stitching, the effect is the same: it feels like someone is in the room.
Some practices take it even further, like building shadow profiles for people who never signed up to a service, inferring interests and connections from the data of friends or family. Others use precise location data to quietly trigger ads based on where you’ve been, or tap into vast broker networks to enrich customer profiles with details the individual never knowingly provided.
Granted, it’s less theatrical than a horror game calling your phone, but the reach, subtlety, and deniability make it far more pervasive and far harder for people to recognise when the line has been crossed. And with “going digital” having “lowered the barriers to entry for creating compelling, tailored and interactive content significantly”, it’s not just global brands who can play at these edges anymore… almost anyone can.
Where’s your line?
We live in a world of privacy laws, compliance frameworks, and regulatory watchlists. A modern version of Hotel 626, even with AI everywhere, would be hard to build without triggering multiple legal tripwires. But that doesn’t mean the temptation is gone.
With generative AI, a “Hotel 626” moment could be rebuilt in weeks, not months, and for a fraction of that $500k price tag. The question is whether marketers will check in for the creativity, or be checked out for the creepiness.