I don’t think the difference in how much damage could potentially be caused is a reasonable argument. After all, the economic damage to writers from others copying or plagiarizing their work, style, or world is limited not because it’s hard for humans to do so, but because we made it illegal to publish something too similar to another person’s copyrighted work.
For example, Harry Potter has absolutely been copied to the extent legally allowed, but no one cares about any of those books because they’re not similar enough to affect the sales of Harry Potter at all. And that’s also true for AI. It doesn’t matter how closely it can replicate someone’s style or story if that replication can never be used or sold without committing copyright infringement, which is already the case right now. Sure, you can use it to generate thousands of books that are just different enough not to get struck down, but those wouldn’t affect the original book at all.
Now, to be fair, with art you can get away with being more similar to others’ work, because of how art works. But, also to be fair, the art market was never about how good an artist was; it was about how expensive the rich people who bought your art wanted it to be for tax purposes. And I doubt AI art is valuable for that.
Except that, by that exact source, mass shootings with rifles are underreported, and the deadliest mass shootings were carried out with semi-automatic rifles.
“Since 1982, there has been a known total of 65 mass shootings involving rifles, mostly semi-automatics. This figure is underreported though, as it excludes the multiple semi-automatic (and fully automatic) rifles used in the 2017 Las Vegas Strip massacre – the worst mass shooting in U.S. history, killing 58 and wounding 546. In fact, semi-automatic rifles were featured in four of the five deadliest mass shootings, being used in the Orlando nightclub massacre, Sandy Hook Elementary massacre and Texas First Baptist Church massacre.”