I’m old enough to remember when age verification bills were pitched as a way to ‘save the kids from porn’ and shield them from other vague dangers lurking in the digital world (like…“the transgender”). We have long cautioned about the dangers of these laws, and pointed out why they are likely to...
These ideas are all fundamentally misguided. Let’s take a step back and consider what we are trying to do here: we want to create a system so that the government can withhold certain information from certain people. That’s both difficult and dangerous.
PornHub’s idea requires cooperation from the hosts. You are not likely to get global agreement on that, so you would still need to do something about foreign sites, such as blocking them.
At that point, such a law would achieve two things:
Society has decided to create a technical censorship infrastructure.
Domestic porn providers have an incentive to support it, because it removes foreign competition.
Blocklists that parents can install on their devices already exist, so there would be no change in that regard.
Of course, minors have no trouble circumventing such software. They have plenty of time and they are horny. You can’t win. The only faint hope might be to include such features at deeper levels, similar to existing DRM schemes. This would be ripe for abuse by bad actors or governments. It certainly would be used against the consumer by the copyright industry and tech monopolies; just like existing DRM schemes.
So we really should ask why we would want to walk further down this expensive, hostile, and dangerous path. Are we afraid that masturbation causes blindness?
The government in this case is forcing sites to collect PII to verify age, not blocking the content themselves.
I am working from the premise that these age verification schemes are not theoretical (they’re the end game of all the KYC startups of the last decade).
In much of the US South, these ID checks are already mandatory, and they will only expand.
A browser header achieves the same result without building a database of people who like porn.
Browser headers also put the responsibility on sites that serve content dangerous to kids: as a site that delivers porn, or anything else unsuitable for kids, it is in your best interest from a liability perspective to check and respect the header.
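The check-and-respect logic described above can be sketched server-side. Note that no such header is standardized today; the header name `X-Age-Verified` and its values below are assumptions for illustration, and a real deployment would follow whatever name a future spec defines.

```python
# Minimal sketch of a site respecting a hypothetical browser-sent
# age-signal header. "X-Age-Verified" is an assumed header name,
# not a real standard.

def is_adult_request(headers: dict) -> bool:
    """Return True only when the browser explicitly signals an adult user.

    Absence of the header is treated as "minor" -- the liability-safe
    default for a site hosting content unsuitable for kids.
    """
    return headers.get("X-Age-Verified", "").strip().lower() == "adult"

def serve(headers: dict) -> str:
    """Gate the response on the age signal."""
    if is_adult_request(headers):
        return "200 OK: adult content"
    return "403 Forbidden: no adult age signal present"
```

The key design choice is the fail-closed default: the site never learns who the user is, only a boolean asserted by the browser (or by parental-control software configuring it), so no identity database is ever built.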