It’s opening full for me. Probably geo paywalled…
Goes away with JS disabled. The rest of the page works fine.
Yeah, hx. It was hx that finally got me using vi-style navigation, and now I choose vim over nano almost always.
newpipe
The text. And probably the images too, though the only mistake being the wrong port depiction (all C) points more toward a human.
ai generated lol
Scaled by the system / by themselves … those look like X11 apps. Why is Firefox in this list? Run it as native Wayland with MOZ_ENABLE_WAYLAND.
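For example, a minimal sketch of launching it that way from a script (assumes Firefox is installed and you’re in a Wayland session; exporting the variable in your shell or desktop environment works just as well):

```python
# Minimal sketch: launch Firefox with MOZ_ENABLE_WAYLAND set so it uses the
# native Wayland backend instead of going through XWayland.
import os
import subprocess

env = os.environ.copy()
env["MOZ_ENABLE_WAYLAND"] = "1"   # tell Firefox to prefer the Wayland backend
subprocess.Popen(["firefox"], env=env)
```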
The bad guys use bots or services and are done. Regular users have to endure it while no security is added.
Put another way, common users can’t easily become the ‘bad guy’, i.e. the cost of an attack is higher, hence fewer script kiddies and automated attacks. You want to reduce that number. These protections are nothing to botnet owners or other high-profile bad actors.
PS: reCAPTCHA (or CAPTCHA in general) isn’t a security feature. At most it can be a safety feature.
stopping automated requests
Yeah, my bad. I meant too many automated requests. Both humans and bots generate spam, and the issue is the high influx of it. Legitimate users also use bots, and that’s by no means harmful. That way you don’t encounter a CAPTCHA every time you visit a Google page, nor does a couple of scraping scripts become a problem. reCAPTCHA (or hCaptcha, say) triggers when there is a high volume of requests coming from the same IP. Instead of blocking everyone out to protect their servers, they can allow slower requests so legitimate users face minimal hindrance.
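A minimal sketch of that idea (assumed names and parameter values, not any real service’s implementation): a per-IP token bucket that throttles heavy clients instead of hard-blocking them, while light clients never notice it.

```python
# Per-IP token bucket: heavy clients get slowed down or challenged,
# light clients always have tokens available.
import time
from collections import defaultdict
from dataclasses import dataclass, field

RATE = 5.0    # tokens refilled per second (assumed value)
BURST = 10.0  # bucket capacity, i.e. allowed burst size (assumed value)

@dataclass
class Bucket:
    tokens: float = BURST
    last: float = field(default_factory=time.monotonic)

buckets: dict[str, Bucket] = defaultdict(Bucket)

def allow(ip: str) -> bool:
    """Return True if the request from `ip` may proceed right now."""
    b = buckets[ip]
    now = time.monotonic()
    b.tokens = min(BURST, b.tokens + (now - b.last) * RATE)  # refill
    b.last = now
    if b.tokens >= 1.0:
        b.tokens -= 1.0
        return True
    return False  # caller could delay or challenge instead of hard-blocking
```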
Most Google services nowadays require accounts with stronger verification (like a cell phone number), so automated spam isn’t a big deal.
And what will you do if someone behind CGNAT is DoSing/scraping your site while you still want others on the same IP to have access? IP-based limiting isn’t very useful, either way.
hCaptcha, Microsoft’s CAPTCHA, they all do the same. Can you give an example of one that can’t easily be overcome just by throwing better compute hardware at it?
There isn’t a good way to tell human users apart from scripts without adding too much friction for normal use. Also, bots are sometimes welcome and useful; it’s a problem only when someone tries to mine data in large volumes or effectively DoS the server.
Forget bots, there are centers in India and other countries where you can employ humans to do ‘automated things’ (YouTube like counts and watch hours, for example) at about the same cost as bots. There are similar CAPTCHA-solving services too. Good luck with those :)
Rate limiting is the only effective option.
The objective of reCAPTCHA (or any CAPTCHA) isn’t to detect bots. It’s more about stopping automated requests and rate limiting. A CAPTCHA is ‘defeated’ if the time it takes to solve, whether by a human or a bot, is less than expected. Humans are very slow, so they can’t beat it anyway.
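The same ‘solving must cost a minimum amount of time’ idea can be made explicit with a hashcash-style proof-of-work challenge. This is a minimal sketch of that technique, not how reCAPTCHA actually works; the function names and the difficulty value are assumptions:

```python
# Hashcash-style proof of work: solving is deliberately slow, verifying is cheap.
# The server tunes `difficulty` so every request costs a minimum amount of compute,
# which is effectively rate limiting.
import hashlib
import itertools
import os

def make_challenge() -> bytes:
    """Server side: issue a random challenge."""
    return os.urandom(16)

def solve(challenge: bytes, difficulty: int = 20) -> int:
    """Client side: find a nonce whose SHA-256 has `difficulty` leading zero bits."""
    target = 1 << (256 - difficulty)
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: bytes, nonce: int, difficulty: int = 20) -> bool:
    """Server side: checking a claimed solution takes a single hash."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))

if __name__ == "__main__":
    ch = make_challenge()
    n = solve(ch)         # takes a noticeable fraction of a second at difficulty 20
    print(verify(ch, n))  # True
```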
Unless it was brute-forced, it might be a cold boot attack, a USB exploit, or a bootloader exploit done by physically accessing the storage.
I think a DRE with a doctorate can tell for sure.