- You are correct, sir!
- >Leaving AI to a handful of companies is in my opinion the fastest route for inequality and power concentration/centralisation
It's a homesteading land grab, plain, simple and pure.
There is a competitive landscape, with first-mover advantages, incumbent effects, and the like, that is being anchored to e.g. Sam Altman's interests and desires at this very moment. If you want a vision of the future of garage AI, imagine a boot stomping on Preston Tucker's face over and over. The current AI industry's goal is to preserve an image of openness and benefit while getting ready to pull the ladder up when the time is right.
- I think the single most important aspect of AI security is for there to be a right to deinfluence.
There has to be a way to rescind my contributions, whether they're voluntary or not (I still think these companies are dancing with copyright violations), as well as to have ALL DERIVATIVES removed that were created by processing the contributions in question, regardless of form or generation (derivatives of derivatives, etc.).
- IANAL, but from my understanding of the points at issue, I think a court might be likely to find that a) sucking the image into RAM is a copy in the first place; b) the FFT/etc. would be a (first) derivative work; c) using a form of the original image sufficient to communicate to the alteration processes what it should be altering would constitute a copy; and/or d) identifying something as a de-copyrighted work will undercut any defenses.
Here's an interesting answer apropos of all this: https://opensource.stackexchange.com/questions/7250/could-i-...