
Let’s talk about Ring, lost dogs, and the surveillance state

Today, let’s talk about the camera company Ring, lost dogs, and the surveillance state. 

You probably saw this ad during the Super Bowl a couple of weekends ago:

Since it aired for a massive audience at the Super Bowl, Ring’s Search Party commercial has become a lightning rod for controversy — it’s easy to see how the same technology that can find lost dogs can be used to find people, and then used to invade our privacy in all kinds of uncomfortable ways, by cops and regular people alike.

Ring in particular has always been proud of its cooperation with law enforcement. That raises big questions about our civil rights, especially since Ring announced a partnership last fall with a company called Flock Safety, whose systems have been accessed by ICE. There’s some complication to that — we’ll come back to it in a bit.

The backlash to Ring’s Super Bowl ad was swift, intense, and effective: the data company PeakMetrics says conversation about the ad on social platforms like X actually peaked two days after the Super Bowl, and the vibes, as they measured them, were strikingly negative. I mean, you know it’s bad when Matt Nelson, who runs the WeRateDogs account, is posting videos like this:

Sen. Ed Markey called the ad “dystopian” and said it was proof that Amazon, which owns Ring, needed to stop using facial recognition technology on Ring doorbells. He said, “This definitely isn’t about dogs — it’s about mass surveillance.”

And then, on Thursday, February 12th, just four days after the Super Bowl, Ring announced it was canceling its partnership with Flock, in a statement first reported by The Verge’s Jen Tuohy. That statement itself is a lot:

Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. As a result, we have made the joint decision to cancel the planned integration. The integration never launched, so no Ring customer videos were ever sent to Flock Safety. 

The statement also goes on to say that police used Ring cameras to identify a school shooter at Brown University in December 2025. It’s an odd non sequitur in a press release about canceling a controversial partnership, but it says a lot about Ring and how the company sees itself.

As it happens, Ring’s founder Jamie Siminoff was just on Decoder a few months ago, talking about how and why he founded the company, and in detail about why he sees Ring’s mission as eliminating crime. Not selling cameras or doorbells, or floodlights, or anything else Ring makes, but getting rid of crime.

We actually talked quite a bit about Search Party, how people might feel about that kind of surveillance, and how Ring works with the cops. In fact, Jamie briefly left Ring in 2023, and the company slowed down on its work with law enforcement. But ever since he returned, the emphasis on crime and the work with police has only intensified. I asked him about it:

NILAY PATEL: You left, Amazon said we’re going to stop working with police, you came back, boy, Ring is going to work with police again. You have a partnership with Axon, which makes the Taser, that allows law enforcement to get access to Ring footage. Did that feel like a two-way door? They made the wrong decision in your absence, and you came back and said, “We’re going to do this again”?

JAMIE SIMINOFF: I don’t know if it’s wrong or right, but I think different leadership does different things. I do believe that I spent a lot of time going on ride-alongs. I spent a lot of time in areas that I’d say are not safe for those people, and I’ve seen a lot of things where I think we can positively impact them. So, we don’t work with police in the way of … I just want to be careful, as we’re not … What we do allow is for agencies to ask for footage when something happens. We allow our neighbors, which I’ll say in this point are our customers, just to be clear, we allow our customers to anonymously decide whether or not they want to partake in that.

So, if they decide they don’t want to be part of this network and don’t want to help this public service agency that asks them, they just say no. If they decide that they do want to, which, by the way, a lot of people want to increase the security of their neighborhoods. A lot of people want their kids to grow up in safer neighborhoods, a lot of people want to have the tools to do that, and are in places that are dangerous. We give them the ability to say yes and make it more efficient for them to communicate with those public service agencies, and also do it in a very auditable digital format.

That’s the other side. Today, without these tools, if a police officer wanted to go and get footage from something, they’d have to go and knock on the door and ask you, and that’s not comfortable for anyone. There’s no digital audit trail of it, and, with this, they can do it efficiently with an audit trail. It is very clear, and it’s anonymous. 

Jamie actually talked a lot about searching for dogs in this context, because one of the reasons he was so excited to come back to Ring was to use AI to search through the massive amounts of video generated by Ring cameras. In fact, he told me that Ring could not have built Search Party five years ago, because AI systems to do it weren’t available. 

Jamie is nothing if not direct about this, which I appreciate. The man really thinks you can use AI and cameras to reduce or even eliminate crime. But I had a lot of questions about this:

JAMIE SIMINOFF: But when you put AI into it, now, all of a sudden, you have this human element that AI gives you. I think, with our products in neighborhoods and, again, you have to be a little bit specific to it, I do see a path where we can actually start to take down crime in a neighborhood to call it close to zero. And I even said, there are some crimes that you can’t stop, of course.

NILAY PATEL: Mechanically, walk people through what you mean. You put enough Ring products in a neighborhood, and then AI does what to them that helps you get closer to the mission of zeroing out crime?

So, the mental model, or how I look at it, is that AI allows to have … If you had a neighborhood where you had unlimited resources, so every house had security guards and those security guards were people that worked the same house for 10 years or 20 years, and I mean that from a knowledge perspective. So, the knowledge they had of that house was extreme; they knew everything about you and that residence and your family, how you lived, the people that came in and out.

And then, if that neighborhood had an HOA with, call it private security, and those private security were also around and knew everything, what would happen? When a dog gets lost, you’d be like, “Oh, my gosh, my dog is lost.” Well, they would call each other, and one of them would find the dog very quickly. So, how do we change that and bring that into the digital world is—

Can I just ask you a question about that neighborhood specifically?

Sure.

Do you ever stop and consider that that neighborhood might suck? Just the idea that every house on my street would have all-knowing private security guards, and I would have an HOA, and that HOA would have a private security force.

You can easily paint that as dystopia. Everyone’s so afraid that we have private cops on every corner, and I’m paying HOA fees, which is just a nightmare of its own.

So, I would assume you live in a safe neighborhood.

I hope so, yeah.

No, today, I’d go to … If you want, I’ll take you to a place where people live and have to, when they get home from school, lock their doors and stay in their house, and they can’t go out and—

But I’m just saying that that model is “everybody is so afraid that they have private cops.”

I think the model is that doing crime in a neighborhood like that is not profitable, and I think that you want people to move into another neighborhood. I don’t think that crime is a good thing and so I think … But listen, it certainly is an argument to have, I do believe that … I think safer neighborhoods allow for kids to grow up in a better environment and I think that allows them to be able to focus on the things that matter and so that’s what we’re going for.

I just wanted to challenge the premise.

I think it’s a fair challenge.

The model is that there are cops everywhere. That level of privacy.

Yeah, it’s not cops. I think it’s more that you’ll have the ability to understand what’s happening. It’s not like … But yeah, I think, listen, it’s a fair statement, I guess. I think I want to live in a safe place.

There’s a lot of intelligence in your neighborhood, and maybe it’s private security, maybe it’s not. What does the AI do? Does it just make the camera smarter? It lets you do a more intelligent assessment of what the cameras are seeing?

Right now, we just say motion detection, motion detection, motion detection. It’s funny, when I started Ring… The book was fun because I got to go back and actually go through this whole story of how this thing came to be, and motion detection was an amazing invention. You’re in the airport, and there’s a motion at your front door, and you look at it like, “Wow, this is crazy.”

Now, with AI, we shouldn’t be telling you about motion detection; we should be telling you what’s there, when you should look at it, when it matters, and we shouldn’t be bothering you all the time. That’s what I mean by this idea of these security guards at your house or in your neighborhood. There should be this intelligence in your neighborhood that can tell you when you should be trying to be part of something, but not always tell you. So, it’s not just like, “Car, car, dog, person, person.” It’s like, “Hey, look at this. You want to pay attention to this right now.”

I really pressed Jamie on this because I still don’t think it is entirely clear how Ring accomplishes the elimination of crime through AI alone. And it’s why people don’t trust the company when it says it won’t use systems that can find a dog to do things that otherwise violate our rights. After all, if your goal is to use AI to stop crime, and you built an AI system that can find a dog… well, it’s pretty obvious what comes next, right?

NILAY PATEL: When you talk about zeroing out crime in a neighborhood, the idea that everyone in a neighborhood has one of those illuminated Ring signs in the front yard, is that enough to—

JAMIE SIMINOFF: It’s a part of it.

Is that just enough of a deterrent? The bad guys will know their face is going to be captured on video, and that will be analyzed by an AI, and something will happen. Do you have to do more outbound deterrents?

I think that’s a part of it. Awareness is a big part of it. I think there are ways with lights also, using lighting to do stuff, that’s a big part of it. I think having just … If, all of a sudden, someone comes outside because something’s an anomaly, that’s a big part of it. It doesn’t have to be some crazy thing. And that’s what I was saying, is a lot of these little things add up to make that work.

So, when you think about it, okay, we can bring crime down in a neighborhood to close to zero, what are the ratcheting steps? Does everyone just get the Ring camera, and your platform does all the work? Is it that someone gets caught and they tell all their friends in jail that they got caught? What are the steps?

I think it’s really about bringing neighbors together for this particular thing. So, it’s about how you individually… and we’ve always thought about how each house is its own node controlled by the neighbors, so controlled by the person, and I’ll keep going back to that, which is one hundred percent, your video is in your control; everything you’re doing is in your control, whether you want to take part in anything is in your control. That has to be the first layer of all of it. 

But then, when something happens, do you want to take part in it? So, if you get an alert that this dog looks like the dog that’s in front of your house, can you contact your neighbor? You can decide not to take part in it, and then no one will ever know, and it’s fine, it’s just basically deleted, or you can take part in it. I think that’s how we can do things that can make a neighborhood into this node where individual neighbors are all on their own, but when things happen, they can work together as they want to.

And you think that AI will accelerate the process?

I think AI is a co-pilot. It is their assistant, and it’s helping them to figure this out. Because, again, if you’re just getting every motion alert, and if you have eight cameras and you’re just getting motion alerts all day, no human being can parse all this data. So that’s what I was talking to Jen about, is that I do think I see a way to use AI to help feed better data to us, which allows us to make better decisions and work together better.

This is where we get to Flock, which Ring announced a partnership with in October 2025. Flock primarily makes cameras and systems to search video for the cops. You’ve probably seen Flock’s devices where you live — they’re those little solar-powered cameras and tracking devices affixed to streetlights or placed in the center of parking lots. They vacuum up large amounts of data, and the company claims it is anonymized before it’s made available to partners, which in most cases means local law enforcement.

However, according to in-depth reporting from the excellent 404 Media, Flock’s data has often found its way to ICE, the FBI, the Secret Service, and other law enforcement agencies, without the requirement of a warrant. That’s because that data is willingly provided by local police.

Last month, under intense scrutiny about what a deal with Flock would mean, Ring said that this partnership was not yet “live,” and that, “Ring has no partnership with ICE, does not give ICE videos, feeds, or back-end access, and does not share video with them.” For its part, Flock says the same — that it doesn’t actually work with ICE, but rather local law enforcement, and it’s those local agencies that work with ICE. This is the complication I mentioned earlier.

If you’re a Decoder listener, you know where this is all going. I asked Jamie about all these databases, who owns them, and what it means to connect them all with AI:

NILAY PATEL: But when you connect a bunch of those databases, particularly to facial recognition, there’s a turn in the privacy conversation where the stakes ratchet up really high, where maybe it’s gone forever.

How are you thinking about that decision-making? Okay, we have a lot of intelligence in the AI; it’s trivial for the AI to connect to another store of information. That’s a thing you can do with AI, especially at a big company like Amazon, where you have lots of other stores of information. There’s a line, what’s the line for you?

JAMIE SIMINOFF: There is a responsibility, obviously just to build safe products. So let’s just start with that. Yeah, we did announce facial, we call it Familiar Faces, but that’s not connected, that’s just for your… Your iPhone today. If you search your iPhone, it’s crazy. Search for someone’s name in your photos, and their pictures come up.

So I do think there’s a balance between not allowing technology to exist that should exist that helps people and gives them more efficiency, gives them safer homes and then also, obviously, not creating this dystopian place. And so, I think that’s the responsibility, but what we’re doing with Familiar Faces is we’re just giving you the ability to say, when my wife comes home, don’t… Because it is silly. Why do I get an alert when my wife comes home? I don’t want it, I don’t need it.

I’m asking this for a lot of reasons, but I look at what’s broadly happening with surveillance footage out in the world. And I’m not saying Ring is participating in this, I’m just giving you an example. ICE has facial recognition systems, and they are arguing that a positive match in their facial recognition system is a definitive determination of someone’s immigration status. That’s way out there. I don’t think you’re doing that.

But you can get to, “Okay, we have facial recognition, we have a bunch of evidence coming off of Ring cameras, to make it really safe, you want to go from passive surveillance to active surveillance. That’s what the studies show. Now the camera will literally identify the criminal by face and tell the cops this person tried to steal a car from this driveway,” and that’s the thing that would get you to actually zero out crime.

There’s a lot of risk in those steps. But if I draw the thread from what you’re saying, it’s all the way to the idea that the criminals won’t come here because the cameras will know who they are and tell the cops. Are you willing to go that far?

I think it’s also that the cameras will alert people. Part of what made Ring and what made neighbors safer with Ring 1.0, and I think we are in Ring 2.0, is that there was no presence at the home. How did people break into homes? They would go and be knock-knock burglars. They would knock-knock, no one was home. It was 3PM, they’d go to the homes next door, find a place that was empty, and they’d go into the home.

Ring allowed you to, now, all of a sudden, when someone comes up to the door, you’re like, “Oh, I got a motion alert. Hi, what’s going on?” and so it gave a presence to the home. So, I don’t think you have to go as far as that real time stuff to get to where we’re talking about, I think it’s more of the anomaly detection and allowing people to make it so that, if someone comes in, that you’re aware of what’s happening around the neighborhood because right now there’s no awareness of what’s going on around it.

So I don’t think it’s as dystopian as where you’re going, and certainly it’s not what we’re building, and I do think we can impact things to a really high level in neighborhoods. Which, again, to the Jen Tuohy thing, in neighborhoods is what we were talking about, that with AI and what we’re doing with a bunch of Rings together. I think even the Dog Search Party is a good way to look at it, which is how these cameras come together for good in the neighborhood.

Today, the backlash against surveillance has led Ring to kill that Flock deal. The company is also doing damage control about Search Party, telling The Verge that the technology that powers the feature is not “capable” of being used to find people, and that there’s no indication that such features are on future road maps.

Sure, it seems obvious that Ring has a lot of trust to earn back here, and certainly, we are all thinking about what it means to put internet-connected security cameras in our homes. I think that’s good — and certainly overdue. 

But let me complicate this a little for you. At the same time this conversation about Ring is happening, we’re also watching regular people record the police and ICE with their cell phones, capturing critical documentary evidence of how those agencies are violating the rights of everyday Americans in ways that are leading to change, however slowly.

Minnesota governor Tim Walz is now telling people to hit record when they see ICE so the footage can be used in future prosecutions. And just last week, the FBI released video that Google appears to have specifically recovered from a Nest camera system at Nancy Guthrie’s house, in order to help identify her kidnapper. 

That’s a lot of video that’s being captured and used in ways that maybe don’t feel so invasive, and in some cases even feel good. But the systems that create, store, and share that video are all the same, and the guardrails around them are just as weak as whatever makes us feel uncomfortable about Ring.

I’m not sure how to feel about all of that video, especially in a world where AI makes it easy to fake, and having some source of truth seems more important than ever. I asked Jamie about that, too — whether Ring will control all the video, or sign it with some metadata so people can ensure it’s real. 

NILAY PATEL: Presuming we have to have an authenticated server, there’s a crime in my neighborhood, and I’ve opted in, and we’re going to say the cops can only get the video from the Ring server, where we know it’s true. I might not be as in control of my video anymore.

JAMIE SIMINOFF: No, not how it’s built and not while I’m here because the way it works is that you will decide if you want to or not want to share that video, which is your property, with someone. Now, once you share it, then it is up to us to figure out, to your point, how do we share it, how do we make sure that the digital fingerprint goes all the way through, or how does the chain of custody work of this video to make sure there’s no fake in the process of it? I think this is why it is important to build these systems.

It’s going to be important, though. This is also where the government is going to have to step in. We’re going to have to deal with this across the board because we also have video coming off of cell phones. So, we do need to figure out how to build… And there’s going to be companies, Axon would probably be one of the companies. I don’t want to speak for them, but they have evidence.com, so to build these evidentiary systems to take in…

Because Ring is one part of taking in data around, call it a crime scene, but cell phone video is maybe even more of a source today. So, how do you take that in? How do you make sure that it actually was captured on the iPhone directly and not tampered with between the two things? We’re going to have to figure it all out. I think we have to work together on it, and the AI stuff is pushing us to do it. I am proud that with Ring, we have built it so that you can take it directly and keep it on the server. You can understand where it was, where it’s from, where it was created, and we have that digital fingerprint on it and the audit trail of it.

You’re going to have to do that more and more as this world is changing, you’re just not going to be able to trust that just because someone sends you a video doesn’t mean it’s true.
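The “digital fingerprint” and audit trail Jamie describes can be sketched in a few lines. To be clear, this is a generic illustration of content hashing and server-side signing, not Ring’s actual system; the key, function names, and scheme here are all assumptions for the sake of the example:

```python
import hashlib
import hmac

# Illustrative only: a generic content-hash + HMAC scheme, not Ring's real design.
# A server holding SECRET_KEY can later attest that a clip is byte-for-byte
# unchanged since it was first ingested.

SECRET_KEY = b"server-side-signing-key"  # hypothetical; real systems would use asymmetric signatures

def fingerprint(video_bytes: bytes) -> str:
    """SHA-256 content hash: changes if even one byte of the video changes."""
    return hashlib.sha256(video_bytes).hexdigest()

def sign(video_bytes: bytes) -> str:
    """HMAC over the content hash, so only the key holder can vouch for the clip."""
    return hmac.new(SECRET_KEY, fingerprint(video_bytes).encode(), hashlib.sha256).hexdigest()

def verify(video_bytes: bytes, signature: str) -> bool:
    """True only if the clip is unmodified and the signature came from the key holder."""
    return hmac.compare_digest(sign(video_bytes), signature)

clip = b"\x00\x01fake-video-bytes"
tag = sign(clip)
print(verify(clip, tag))            # the unmodified clip verifies
print(verify(clip + b"\xff", tag))  # any tampering breaks verification
```

The point of a scheme like this is exactly the chain-of-custody question in the interview: anyone can claim a video is authentic, but only a verifiable fingerprint plus a trusted signer lets a court or a newsroom check it.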

You get the feeling we’ll be coming back to that idea quite a lot in the years to come. 

But today Ring has canceled its deal with Flock, and Flock itself is putting out blog posts flatly stating it does not have a contract with ICE, and noting that Ring’s other partner, Axon, does in fact have an ICE contract. 

In the meantime, Search Party is still active and on by default, although you can just go into settings and flip it off. And the enormous amount of video we all generate is being uploaded to servers run by big companies that have their own dealings with governments and law enforcement agencies far outside of our control. 

There are a lot of possible solutions to these problems, and lots of ways regulation could balance privacy and civil liberties with the needs of police. But right now, in 2026 America, I’m not sure we’re really going to be able to do that.

So we’re going to keep pushing the leaders of these companies on what they really mean, and keep running the answers so you can listen and decide. I think it’s about time we started thinking about how all the technology we use to make our own lives better affects other people. 

Because that bigger conversation we need to have? Yeah, that’s what it’s really about.

Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!