I Am Not a Robot

#1
Zinjanthropos Offline
Why is it important that I am not a robot when signing onto some websites? I get the feeling something has passed me by. What happens if I don't click the box? Do I get a visit from someone wearing sunglasses and a trench coat?
#2
C C Offline
Often happens when a proxy, VPN, or Tor is detected (which roaming botware apparently uses a lot to obscure its origins / locations).

My tentative guess is that websites don't like their bandwidth or accessibility being consumed by bots which aren't receptive to ads (in their own disinterested way) any more than by those human visitors whose browsers have blockers turned on or JavaScript disabled. (With regard to the latter, some places will issue an advisory or outright deny admittance to real people if ad and cookie / tracker blockers are detected or JS is off.)
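The sort of screening described above might look roughly like this on the server side (a minimal sketch; the addresses, header checks, and thresholds are all illustrative, not any real blocklist or service's API):

```python
# Minimal sketch of simple bot heuristics of the kind described above.
# The proxy list uses RFC 5737 example addresses; a real site would
# consult a maintained blocklist or a reputation service.

KNOWN_PROXY_EXIT_IPS = {"203.0.113.7", "198.51.100.42"}  # illustrative only

def looks_like_bot(ip: str, user_agent: str, js_enabled: bool) -> bool:
    """Return True if a request matches crude bot heuristics."""
    if ip in KNOWN_PROXY_EXIT_IPS:       # traffic via a known proxy/VPN/Tor exit
        return True
    if not user_agent:                   # many crude bots send no User-Agent at all
        return True
    if "bot" in user_agent.lower():      # self-identifying crawlers
        return True
    if not js_enabled:                   # no JS often also means no ad revenue
        return True
    return False
```

A request that trips any heuristic would then be shown the "I am not a robot" challenge rather than being let straight through.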

~
#3
Zinjanthropos Offline
(Oct 8, 2018 05:42 PM)C C Wrote: Often happens when a proxy, VPN, or Tor is detected (which roaming botware apparently uses a lot to obscure its origins / locations).

My tentative guess is that websites don't like their bandwidth or accessibility being consumed by bots which aren't receptive to ads (in their own disinterested way) any more than by those human visitors whose browsers have blockers turned on or JavaScript disabled. (With regard to the latter, some places will issue an advisory or outright deny admittance to real people if ad and cookie / tracker blockers are detected or JS is off.)

~

In the future, when robots are going about our daily tasks, their owners should be able to have one bot log in to a website as their representative. A future human may not even have to get close to a computer to log in. OMG... those poor bastards of the future, bombarded with ads from their own bots.
#4
stryder Offline
Bots can behave badly. They can be misconfigured, or built to behave that way on purpose. Bots can replicate code or posts to create spam, or be used to probe for weaknesses and launch exploits.

There is a mixture of concerns from a webmaster's position.

For instance, most accesses to this site are by bots, and the number of bots increases over time (people try creating their own, new start-ups appear, and of course exploits circulate). If the site were accessible to all bots, now and in the future, one day it would fall over. The company providing the space may place a limit on how much access is allowed, measured in access attempts or in writes to the database. The site can effectively suffer a DoS, or even have its script engine misbehave or its database break.
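A hypothetical per-client limit of the kind a host might impose can be sketched as a token bucket (illustrative only; the capacity and refill rate here are made-up numbers, and real hosts do their own accounting):

```python
import time

class TokenBucket:
    """Simple per-client rate limiter: each request spends one token,
    tokens refill over time, and an empty bucket means the request is refused."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at the bucket's capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A scraper hammering the site burns through its bucket quickly and gets refused, while an ordinary reader clicking through pages never notices the limit.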

What do bots do when they access here?
Well, some legitimate ones are scraping caches of the site for use with search engines. (Google copies site data so it can then use its servers/algorithms to compute a search placement, where of course the site allows it.)

Others are looking for particular exploits, using specifically crafted URLs to attempt to subvert the server's software.
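The crude screening a server might apply to such crafted URLs can be sketched like this (the patterns are illustrative examples of common probe signatures, nowhere near a complete list, and real deployments rely on web application firewalls rather than hand-rolled filters):

```python
# Illustrative signatures of common exploit probes: path traversal,
# script injection, SQL injection, and null-byte tricks.
SUSPICIOUS_PATTERNS = ("../", "/etc/passwd", "<script", "union select", "%00")

def is_suspicious_url(url: str) -> bool:
    """Flag request URLs that look like exploit probes."""
    lowered = url.lower()
    return any(pattern in lowered for pattern in SUSPICIOUS_PATTERNS)
```

Requests that match would typically be logged and refused before they ever reach the forum software.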

Still others sign up as humans to create a zombie horde of silent accounts. (Incidentally, these can then be used to build a list of account names and passwords with which to blackmail companies by convincing them their databases have been breached; you'd be surprised how much people are willing to pay for a list of fake account names and passwords. It's also why I keep such accounts to a minimum, and the ones that do make it onto this site are placed into a moderation queue in case of future activity.)

Zombie accounts can be activated in the future to flood spam or manipulate opinion in a coordinated attack, with the hope that exploiting the account's tenure will get them around some sites' security settings.

The problem is that while a bot can misbehave simply due to the nuances of site differences, there is still the potential for exploiters to use Turk systems, recruiting human conspirators to get them past any bot-related failsafes. Those networks typically pay a human a few cents per solved captcha (although exploited websites can also be made to serve extra captchas that unknowing visitors solve on behalf of botnets).

In a nutshell, it's to slow down how easy it is for robots to mess things up.




