
caconym_
Joined · 5,751 karma
I am a professional software engineer and amateur fiction writer. I work for $CORPORATION. All views expressed here are my own.

  1. Private entities surveil you to make money off you or protect their property. Law enforcement surveils you to arrest you and charge you with crimes. These are not the same, and that's why some people care more about surveillance by law enforcement.

    As an example, see the recent case of the woman who was arrested simply for driving through a town at the same time as a robbery occurred. That sort of thing is why people care.

    If the data collection is performed by a private entity and then sold to the government, that is government surveillance. I agree that this is more widespread than Flock and other big names. However, Flock and its ilk currently stand to do far more damage in practice. They offer integrated turnkey solutions available to practically any law enforcement agency, from shithead chud officers in tiny shithole towns to the NYPD and all its grand history of institutionalized misconduct, and we are already seeing the effects of that.

    See, also, the recent case of a teenager who was arrested because a Flock camera or similar thought a Doritos bag in his pocket was a gun. I'll let you guess what color his skin was.

  2. Conjecture on the functional similarities between LLMs and humans isn't relevant here, nor are sophomoric musings on the nature of originality in creative endeavors. LLMs are software products whose creation involves the unauthorized reproduction, storage, and transformation of countless copyright-protected works—all problematic, even if we ignore the potential for infringing outputs—and it is simple to argue that, as a commercial application whose creators openly tout their potential to displace human creators, LLMs fail all four fair use "tests".
  3. > you shouldn't have given away your work for free.

    Almost none of the original work I've ever posted online has been "given away for free": it was protected by copyright law that AI companies are brazenly ignoring, except where they make huge deals with megacorporations (e.g. OpenAI and Disney) because they do in fact know that what they're doing is not fair use. That's true whether or not I posted it in a context where I expected compensation.

  4. They are welcome to do so! I encourage anyone who finds this scraping problem interesting to approach it in whatever way they find the most rewarding!
  5. True, if you think the images have no value, and neither does the time I saved by "outsourcing" the work, but that writing the kind of trivial web scraper I've written N times before somehow does.

    Personally I would disagree!

  6. I don't know what he did, but I gave gemini-cli the URL and asked for a script; something roughly like the sketch below. The LLMs are pretty good at this sort of simple but tedious implementation.
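    (For the curious, a minimal sketch of the sort of script this amounts to; the page URL, output directory, and img-tag parsing below are generic placeholders, not the actual page or the exact code gemini-cli generated.)

        # Fetch a page, collect its <img> tags, and download each image.
        # Standard library only; PAGE_URL and OUT_DIR are placeholders.
        import os
        import urllib.request
        from html.parser import HTMLParser
        from urllib.parse import urljoin

        PAGE_URL = "https://example.com/gallery"  # placeholder, not the real page
        OUT_DIR = "images"

        class ImgCollector(HTMLParser):
            def __init__(self):
                super().__init__()
                self.srcs = []

            def handle_starttag(self, tag, attrs):
                # Record the src of every <img> tag we encounter.
                if tag == "img":
                    src = dict(attrs).get("src")
                    if src:
                        self.srcs.append(src)

        def main():
            os.makedirs(OUT_DIR, exist_ok=True)
            html = urllib.request.urlopen(PAGE_URL).read().decode("utf-8", "replace")
            parser = ImgCollector()
            parser.feed(html)
            for i, src in enumerate(parser.srcs):
                url = urljoin(PAGE_URL, src)
                name = os.path.basename(url.split("?")[0]) or f"image_{i}"
                urllib.request.urlretrieve(url, os.path.join(OUT_DIR, name))

        if __name__ == "__main__":
            main()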
  7. Releasing anything as "GPT-6" which doesn't provide a generational leap in performance would be a PR nightmare for them, especially after the underwhelming release of GPT-5.

    I don't think it really matters what's under the hood. People expect model "versions" to be indexed on performance.

  8. People who believe in baseless conspiracy theories have to convince themselves that people who don't are operating in the same epistemic mode, picking and choosing what to believe in order to reinforce their prior beliefs, because the alternative is admitting that those people are operating in a superior epistemic mode where they base their beliefs on most or all of the available evidence (including, in this case, the fact that the """vaxxed""" people they know are all still upright and apparently unharmed after years of predictions to the contrary).

    Your comment is a manifestation of this defense mechanism. As real evidence piles up that you've been wrong, you retreat into these bizarre imaginary scenarios in which you've been right the whole time, and by projecting that scenario onto others you imagine yourself vindicated. But the rest of us just think you're nuts.

  9. > Be curious to hear what specifically your company is doing to give people agency.

    Wrt. AI specifically, I guess we are simply a) not using AI as an excuse to lay off scores of employees (at least, not yet) and b) not squeezing the employees who remain with arbitrary requirements that they use shitty AI tools in their work. More generally, participation in design work and independent execution are encouraged at all levels. At least in my part of the company, there simply isn't the same kind of miserable, paranoid atmosphere I hear about at MS and Amazon these days. I am not aware of any rigidly enforced quota for PIPing people. Etc.

    Generally, it feels like our leadership isn't afflicted with the same kind of desperate FOMO fever other SMEGMAs are suffering from. Of course, I don't mean to imply there haven't been layoffs in the post-free-money era, or that some people don't end up on shitty teams with bad managers who make them miserable, or that there isn't the usual corporate bullshit, etc.

  10. > Do you think those non-techies are sympathetic to the Microsofties and Amazonians?

    As somebody who has lived in Seattle for over 20 years and spent about 1/3 of it working in big tech (but not either of those companies), no, I don't really think so. There is a lot of resentment, for the same reasons as everywhere else: a substantial big tech presence puts anyone who can't get on the train at a significant economic disadvantage.

  11. It kinda seems like you're conflating Microsoft with Seattle in general. From the outside, what you say about Microsoft specifically seems to be 100% true: their leadership has gone fucking nuts and their irrational AI obsession is putting stifling pressure on leaf-level employees. They seem convinced that their human workforce is now a temporary inconvenience. But is this representative of Seattle tech as a whole? I'm not sure. True, morale at Amazon is likely also suffering due to recent layoffs that were at least partly blamed on AI.

    Anecdotally, I work at a different FAANMG+whatever company in Seattle that I feel has actually done a pretty good job with AI internally: providing tools that we aren't forced to use (i.e. they add selectable functionality without disrupting existing workflows), not tying ratings/comp to AI usage (seriously, how fucking stupid are they over in Redmond?), and generally letting adoption proceed organically. The result is that people have room to experiment with it and actually use it where it adds real value, which is a nonzero but frankly much narrower slice than a lot of """technologists""" and """thought leaders""" would have us believe.

    Maybe since Microsoft and Amazon account for the lion's share (do they?) of big tech employment in Seattle, your point stands. But I think you could present it with a bit of a broader view, though of course that would require more research on your part.

    Also, I'd be shocked if there wasn't a serious groundswell of anti-AI sentiment in SF and everywhere else with a significant tech industry presence. I suspect you are suffering from a bit of bias due to running in differently-aligned circles in SF vs. Seattle.

  12. > I think this is a pretty black and white and simple view of things, fault is not always 100% clear, and CLAIMING fault is different from explaining what happened _from your perspective_, and letting the other driver do the same. But I'm not actually speaking about simple fault in a basic traffic collision.

    Seems like having video (and GPS speed, etc.) can only make it clearer who (which may include both parties) is at fault? I still don't see how that can be a bad thing if you also aren't interested in lying about what happened.

    > I think there's a huge asymmetry between the upside of the dash cam and the downside of self-surveillance.

    I almost addressed the generalized surveillance angle in my original comment, but didn't since it seemed that your comment was focused exclusively on the context of having been in a traffic collision.

    Addressing it now, I guess I am just not too worried about this angle when my dashcam simply records videos onto an SD card that I have complete control over. If I was a person likely to be targeted by my authoritarian government, I would probably think twice about having such an unencrypted SD card sitting around where it might be swept up in a bogus search and used to gin up additional bogus charges against me, but that is currently not my situation. Really, I can only imagine the video evidence collected by my dashcam being used to exonerate me in a scenario like the one you describe, e.g. if an LPR tagged me on the block where the murder happened but my dashcam clearly showed that I was just passing through.

    In fact, this exact thing recently happened (https://www.cbsnews.com/colorado/news/flock-cameras-lead-col...) to a woman who was falsely accused of theft based on LPR data and used her Rivian's dashcam recordings (among other data) to get the police to drop the charges. It's insane that this happened in the first place, but that's beside the point here.

    Of course, people using cloud-based dashcams are certainly exposing themselves to dragnet surveillance—which I do have a problem with simply on principle—but the data on my dashcam's SD card are fundamentally inaccessible to law enforcement until they obtain it in a physical search of my car.

  13. If you believe you are at fault in a collision where police, insurance, etc. are involved, they are going to ask for your statement, and at that point you will be forced to choose between lying and admitting fault. If you're glad that no dashcam footage exists, presumably you are going to lie about what happened! I don't see why this is any different than popping the SD card out of your dashcam and lying about that too—you're still lying, and for the same reason: to evade responsibility for a collision you caused.
  14. Why do you think potentially self-incriminating self-surveillance is "crazy" when you also think lying to the cops and other involved parties about what happened is bad? If you believe it's important to tell the truth in these situations, you should have no problem providing your own recordings of a collision, regardless of who is at fault.

    Or is your point just about the cost of the dashcam being "crazy"? In that case, hypothetically, what if your insurance company cut you a check to buy a dashcam of your own choice and install it on your car?

  15. The US has antivaxxers in charge of health policy now, and they have specifically targeted mRNA vaccines with funding cuts. They seem likely to hinder rather than help any near-future vaccine development program in response to a pandemic.
  16. I explicitly addressed this in my comment.
  17. The issue is specific to Apple! IIUC they're the only mainstream cloud storage provider that offers E2EE, and I'm sure many of their customers chose them over their competitors for that reason.
  18. The issue is with Apple specifically in the sense that they have been offering a superior E2EE cloud storage service that will soon be denied to UK residents (IIUC, E2EE isn't offered by their competition, e.g. Google or Microsoft). But the article goes out of its way in its first section to note that Apple isn't in the wrong at all here:

    > But I will say that the shutdown of ADP is Apple being on the right side of the geopolitical fight, as inconvenient as that may be to you and me.

    It is, if you care about the issues the author evidently cares about, "time to start de-Appling". I am a satisfied ongoing customer of Apple and I didn't find this headline to be the least bit inflammatory. It is, at worst, minor clickbait—but it's not really bait at all, since the contents of the article match the headline.

  19. E2EE cloud storage is not some kind of magic that only tech bigcorps can provide. I de-Dropboxed a few years ago, replacing it with Syncthing running on a local NAS, with e2ee backups to Backblaze and a WireGuard VPN out to my mobile devices. Sure, this is not the sort of thing most people can set up for themselves, but I don't think that's particularly relevant in context.
  20. > So, a UK-only advice

    So what?

    > it strangely assumes that any other service in UK wouldn’t be bound by the same laws.

    From the linked article:

    > I’m not going to tell you where to move your stuff other than to say that if you’re moving it from one big tech company to another, you’re just being daft. Likewise, if you’re moving your stuff to a non-e2ee service, don’t bother. If you need an e2ee service try Proton. They have a Black Friday sale on.
