Malicious bots posing as regular users make up 30 percent of traffic to government websites – well above the overall average of 20 percent of web traffic for non-government sites – according to a report from Distil Networks.

The report, which highlights bot traffic across the internet, notes that bots on government sites tend to focus on scraping information from election registration and business registration sites. Distil defines “bad bots” as ones that scrape data without permission or undertake criminal activities.
On government sites, bad bots make up 30 percent of traffic, good bots account for roughly six percent, and human users fill out the remainder at around 64 percent. While government sees more bad bots than the average sector, some private-sector areas attract a large share as well, including the financial, ticketing, and IT and services sectors.

“Bad bots are evolving and are more sophisticated than ever. Increasingly they’re mimicking real human workflows across web applications to ‘behave’ like real users,” the report states.

Education was also among the most targeted sectors, recording the third-highest share of bad bot traffic at 38 percent.

“Bots are deployed by malicious operators looking for research papers, class availability, and to access user accounts,” the report states regarding the education sector.

Across the web as a whole, bad bot traffic fell from 21.8 percent to 20.4 percent, with good bots making up 17.5 percent and humans accounting for the remaining 62.1 percent.

MeriTalk Staff