An Internet bot, also known as web robot, WWW robot or simply bot, is a software application that runs automated tasks (scripts) over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyzes and files information from web servers at many times the speed of a human.
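The "fetches, analyzes and files" loop of a web spider can be sketched in miniature. The following is an illustrative sketch, not a production crawler: it shows only the "analyze" step, extracting the links a spider would queue for its next fetches, using Python's standard-library HTML parser on a hard-coded sample page (the page content and URLs are hypothetical).

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags: the 'analyze' step of a spider."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A real spider would fetch this HTML from a web server; here it is inlined.
page = '<html><body><a href="/about">About</a> <a href="https://example.com/docs">Docs</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the URLs a spider would visit next
```

A full spider would repeat this cycle, fetching each discovered link in turn, which is what lets it cover a site many times faster than a human clicking through pages.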
Given the exceptional speed with which bots can perform their relatively simple routines, bots may also be implemented where a response speed faster than that of humans is required. Common examples include gaming bots, whereby a player achieves a significant advantage by implementing some repetitive routine with the use of a bot rather than manually, and auction-site bots, where last-minute bid-placing speed may determine who places the winning bid: using a bot to place counterbids affords a significant advantage over bids placed manually.
Bots are routinely used on the Internet where the emulation of human activity is required, for example in chat bots. A simple question-and-answer exchange online may appear to be with another person, when in fact it is with a bot.
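The question-and-answer exchange described above can be sketched as a minimal rule-based chat bot. This is a simplified illustration under the assumption of keyword matching against a small canned-response table (the keywords and replies are hypothetical); real chat bots use far more sophisticated language processing.

```python
# Canned keyword -> reply table; a real chat bot would use NLP instead.
RESPONSES = {
    "hello": "Hi there! How can I help you?",
    "hours": "We are open 9am to 5pm, Monday to Friday.",
    "price": "Our basic plan starts at $10 per month.",
}
FALLBACK = "Sorry, I don't understand. Could you rephrase that?"

def reply(message: str) -> str:
    """Return the first canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RESPONSES.items():
        if keyword in text:
            return answer
    return FALLBACK

print(reply("Hello, is anyone there?"))
print(reply("What time do you close?"))
```

Even a lookup table this crude can sustain a short exchange that a casual user might mistake for a human, which is precisely the property the surrounding text describes.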
While bots are often used simply to automate a repetitive online interaction, their ability to mimic actual human conversation and avoid detection has resulted in the use of bots as tools of covert manipulation. On the Internet today, bots are used to artificially alter, disrupt or even silence legitimate online conversations. Bots are sometimes implemented, for example, to overwhelm the discussion of a topic which the bot's creator wishes to silence. The bot may achieve this by drowning out a legitimate conversation with repetitive bot-placed posts, which may in some cases appear reasonable and relevant, and in others be unrelated or nonsense chatter, or alternatively by overwhelming the target website's server with constant, repetitive, pointless bot-placed posts. Such bots play a significant role in modifying, confusing and silencing conversations about, and the dissemination of, real information regarding sensitive events around the world.
The success of bots may be largely due to the genuine difficulty of distinguishing an online interaction with a bot from one with a live human. Given that bots are relatively simple to create and deploy, they are a very powerful tool with the potential to influence every segment of the World Wide Web.
Efforts by servers hosting websites to counteract bots vary. A server may choose to outline rules for the behavior of Internet bots by publishing a robots.txt file: a plain-text file stating the rules governing a bot's behavior on that server. Any bot interacting with (or 'spidering') a server that does not follow these rules should, in theory, be denied access to, or removed from, the affected website. If the server's only rule mechanism is a posted text file with no associated enforcement software, then adhering to those rules is entirely voluntary: in reality there is no way to enforce them, or even to ensure that a bot's creator or operator acknowledges, or even reads, the robots.txt file's contents.
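A well-behaved bot checks these rules before each request. Python's standard library ships a parser for this format, `urllib.robotparser`, which the sketch below applies to a hypothetical inline robots.txt (a real bot would download the file from the target site's `/robots.txt` URL; the example.com paths here are assumptions for illustration).

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, supplied inline for the example.
# It forbids all bots ("*") from the /private/ section of the site.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# A compliant bot consults can_fetch() before requesting each URL.
print(rp.can_fetch("*", "http://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/index.html"))  # True
```

Note that `can_fetch()` only reports what the rules say; nothing stops a bot from ignoring the answer, which is exactly the voluntary-compliance problem described above.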
Continue reading here: https://en.wikipedia.org/wiki/Internet_bot