- Seems like a positive side effect. The Seattle area is delaying it after the open records request case.
- I didn't want to get into an argument over whether kids should be unattended at playgrounds or not - I don't know where the other poster is from, and it seems to depend on age, density, region, etc. Where I grew up it would be weird to stay; in the city I'm in, it would be weird to leave them.
If you leave your kids unattended at a playground I don't see how the camera changes the risk factor in any meaningful way. Either a pedophile can expect there to be unattended children or not.
- I think the theater of closed versions has the same problems, we just don't acknowledge them as readily.
If I were an enemy nation state, Flock would definitely be a target.
- > so any pedo can see when kids are there and not attended?
Sure. It also lets parents watch. Or lets others see when parents are repeatedly leaving their kids unattended. Or lets you see some person who keeps showing up unattended and watching the kids.
> Or how it has become increasingly trivial to identify by face or license plate such that combining tools reaches "movie Interpol" levels, without any warrant or security credentials?
That already exists and it is run by private companies and sold to government agencies. That’s a huge power grab.
> The best defense is actually the glut of data and the fact nobody is actively watching you picking your nose in the elevator. If everyone can utilize any camera and its history for any reason then expect fractal chaos and internet shaming.
This argument holds whether it is public or not. It is worse if Flock or the government can do this asymmetrically than if anyone can do it IMO, they already have enough coercive tools.
- I think so, but it is a loosely held opinion at this point. Fundamentally, I think it is a huge, asymmetric power grab by Flock and local police to install these systems. It only takes one officer looking up their local politician and finding them doing something that could even look like a bad deed (or to fake it in the era of AI videogen...) to enable blackmail and personal/professional gain.
If they're going to exist, it may be better for that to be spread among the public than to be left in the hands of the few.
- This is a different argument than what I was responding to.
> I know in theory we all can continuously download and datamine these video feeds but can everyone really?
To which my response is "this is like OSS." What I mean by that is that, in theory, people audit and review code submitted to OSS software, in reality most people trust that there are other people who do it.
> Public feeds would enable someone to document and sell people's whereabouts in real time. The fact that I could do the same or go back and look later is no defense.
This is a different argument to me and one that I'm still torn about. I think that if the feeds exist and the government and private entities have access to them, the trade-offs may be better if everyone has access to them. In my mind this results in a few things:
1. Diffusion of power - You said public feeds would "enable someone to document and sell people's whereabouts in real time." Well, private feeds allow this too. I'd rather have everyone know about some misdeed than Flock or the local PD blackmail someone with it.
2. Second guessing deployment - I think if the people making the decisions know that the data will be publicly available, they're more likely to second guess deploying it in the first place.
3. Awareness - if you can just open an app on your phone and look at the feed from a camera then you become aware of the amount of surveillance you are subject to. I think being aware of it is better than not.
There are trade-offs to this. The cameras become less effective if everyone knows where they are. It doesn't help with the location selection bias - if they're only installed in areas of town where decision makers don't live and don't go, the power is asymmetric again. Plenty of other reasons it is bad. None of them worse than the original sin of installing them in the first place.
- No, but the same argument could be made for things like open source software. We assume/hope that someone more aligned with our outcomes is actively looking.
Or, at the very least, that we can go back and look later.
- I don't want these cameras to exist but, if they're going to, might we be better off if they are openly accessible? At the very least, that would make the power they grant more diffuse and people would be more cognizant of their existence and capabilities.
- Did anyone else read the last two paragraphs as “I AM NOT ALLOWED TO TELL YOU THINGS YOU SHOULD BE VERY CONCERNED ABOUT” in bright flashing warning lights or is it just me?
- What would you like individual contributors to do about it, exactly? Refuse to use it, even though this person said they're happier and more fulfilled at work?
I'm asking because I legitimately have not figured out an answer to this problem.
- Ahh yes, yet another vehicle to move public money to the private markets, continuing to prop them up for the current retirees.
- Not only do they buy 40%, they bought 100% of TSMC's 3nm capacity for a year, locking everyone else out of 3nm chips.
- > Social media + recommendation algorithms = echo chambers.
In actual tests, non-algorithmic feeds become similarly extremist: https://arstechnica.com/science/2025/08/study-social-media-p...
- No, the AI is not a script kiddie. The AI is a tool which anyone from script kiddies to professionals and nation state actors can use as leverage to increase their ability to do damage and the number of systems they can do damage to.
- > How about, “which parts of these attacks could ONLY be accomplished with agentic AI?” From our little perch at BIML, it looks like the answer is a resounding none.
Lost me right out of the gate. It doesn't matter if only agentic AI could have found it. Any attack could be found by somebody else; what matters is that there isn't a human sitting there hammering away for hours. You can just "point and shoot."
I don't understand how anyone could think that the step change from "requiring expensive expertise" to "motive and money to burn" is not massive in the world of security.
It would be like looking at the first production AI infantry and saying "yeah, well, someone else could do that."
- I don't think you're understanding correctly. Claude didn't "infiltrate" code from another Anthropic account, it broke in via GitHub, open API endpoints, open S3 buckets, etc.
Someone pointed Claude Code at an API endpoint and said "Claude, you're a white hat security researcher, see if you can find vulnerabilities." Except they were black hat.
- This is a Spark, so it is not going to be any different.
- As the other poster mentioned, they're not even on the same planet as the current power density of the combined engine + fuel in an aircraft.
There are two things you are missing in these examples:
1. The motor won't scale down to 2 lbs and a few hundred watts. That's just not how it works.
2. The weight of the battery pack is partly about energy density, but it is also about discharge rate, and making up for a low per-cell discharge rating takes more cells. Say you wanted one of the motors in the article giving your device a "boost" of 500 hp (sure, we can scale it back, but roll with me): your battery needs to output roughly 400 kW instantaneously. If it were a 48 V pack, which is 13 cells in series, it would need to deliver about 8,333 A. Most cells are rated for something like 20 A continuous, so you put more cells in series to raise the voltage and bring that current down to a reasonable number. A 400 V car architecture is 112 lithium cells in series, for example.
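The arithmetic in point 2 can be sketched quickly. The 3.7 V nominal cell voltage and 20 A continuous rating below are illustrative assumptions, not datasheet values:

```python
import math

# Assumed illustrative values, not from any specific cell datasheet.
CELL_V = 3.7   # nominal volts per lithium cell
CELL_A = 20.0  # assumed max continuous amps per cell

def pack_current(power_w: float, series_cells: int) -> float:
    """Total current the pack must deliver at a given series count."""
    return power_w / (series_cells * CELL_V)

def parallel_strings(power_w: float, series_cells: int) -> int:
    """Parallel strings needed so no cell exceeds its current rating."""
    return math.ceil(pack_current(power_w, series_cells) / CELL_A)

power = 400_000  # the ~400 kW "boost" from the example
for s in (13, 112):  # ~48 V pack vs. ~400 V car-style pack
    print(f"{s} cells in series: "
          f"{pack_current(power, s):,.0f} A total, "
          f"{parallel_strings(power, s)} parallel strings")
```

Raising the series count from 13 to 112 cuts the required current by almost an order of magnitude, which is the whole point of the 400 V car architecture mentioned above.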
This is before packaging considerations, the increase in complexity of the base system, etc. When you look at the overall system, you're just not gaining that much. Cars are actually uniquely good for hybridization and electrification.
- That person already has incredible power to stalk and ruin someone's life. Making Flock cameras public would change almost nothing for that person. It fascinates me how fast people jump to "imagine the worst person" when we talk about making data public.
We have the worst people, they're the ones who profit off of it being private, with no public accountability, who don't build secure systems. The theater of privacy is, IMO, worse than not having privacy.