A Crawler-based Study of Spyware on the Web
This paper examines the threat of malicious spyware from an Internet perspective. The researchers used a Web crawler to study executables and conventional Web pages, looking for malicious objects.
To identify spyware-infected executables on the Web, they first determined whether a Web object contained executable software, then downloaded, installed, and executed that software on a virtual machine, and finally analyzed whether the installation or execution of the software caused a spyware infection.
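A minimal sketch of that three-step pipeline is shown below. The function names and the virtual-machine interface (`vm`) are hypothetical stand-ins, not the authors' actual tooling:

```python
import requests

# Content types that commonly indicate executable software (an illustrative list).
EXECUTABLE_TYPES = {"application/octet-stream", "application/x-msdownload"}

def is_executable(url):
    """Step 1: decide whether the Web object looks like executable software."""
    head = requests.head(url, allow_redirects=True, timeout=10)
    return head.headers.get("Content-Type", "").split(";")[0] in EXECUTABLE_TYPES

def analyze_url(url, vm):
    """Steps 2-3: install and run the binary inside a clean virtual machine,
    then check for spyware side effects. `vm` is a hypothetical wrapper
    around a VM sandbox; its methods stand in for whatever tooling is used."""
    if not is_executable(url):
        return None
    binary = requests.get(url, timeout=30).content
    vm.restore_clean_snapshot()    # start from a known-good snapshot
    vm.install_and_run(binary)     # install and execute the downloaded software
    return vm.detect_infection()   # e.g. run an anti-spyware scan, diff the VM state
```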
They also discussed the types of spyware found, such as adware, keyloggers, Trojan downloaders, browser hijackers, and dialers.
Certain defense mechanisms against spyware, such as signature-based tools and blacklisting, were discussed in detail.
A signature-based anti-spyware tool is one of the most common defenses. By comparing a database of spyware signatures against the files and processes running on a client computer, it can detect when the computer is infected with known spyware programs.
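As a simple illustration (not the tool evaluated in the paper, and with an invented signature set), a signature check can be reduced to hashing each file and looking the digest up in a database of known spyware; real tools also match richer byte patterns:

```python
import hashlib
from pathlib import Path

# Hypothetical signature database: SHA-256 digests of known spyware binaries.
SPYWARE_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_file(path: Path) -> bool:
    """Return True if the file's hash matches a known spyware signature."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in SPYWARE_SIGNATURES

def scan_directory(root: Path):
    """Scan every file under `root` and return the ones flagged as spyware."""
    return [p for p in root.rglob("*") if p.is_file() and scan_file(p)]
```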
Blacklisting: To keep spyware in check, blacklists contain URLs or domains that are suspected of hosting spyware, making it easy for a firewall or proxy to block clients from accessing the listed sites.
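A proxy- or firewall-side check against such a list is straightforward. The sketch below uses an invented blacklist and helper name; it extracts the host from each requested URL and refuses listed domains (and their subdomains):

```python
from urllib.parse import urlparse

# Hypothetical blacklist of domains suspected of hosting spyware.
BLACKLIST = {"spyware-example.com", "bad-downloads.example.net"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or any parent domain, is blacklisted."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return any(".".join(parts[i:]) in BLACKLIST for i in range(len(parts)))

# A proxy or firewall would call this before forwarding the request:
assert is_blocked("http://downloads.spyware-example.com/setup.exe")
assert not is_blocked("http://example.org/index.html")
```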
The paper then explains the drive-by download attack: its infrastructure, where it originates, the infections it causes, and its effect on the Firefox browser.
· A drive-by download is a program that is automatically downloaded to the user's computer, often without the user's consent or knowledge. It may, for example, replace the user's home page and change browser settings.
· It occurs when a victim visits a Web page with malicious content.
· They examined URLs from eight different Web categories and calculated the fraction of URLs and domains that were infectious in each (a sketch of this calculation follows the list). They found no drive-by download attacks on either the “kids” or “news” sites, whereas the “pirate” sites suffered more attacks.
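The per-category measurement reduces to a simple proportion over the crawl's scan results. The sketch below uses an assumed tuple format and obviously placeholder inputs, not the paper's data:

```python
from collections import defaultdict

def infection_rates(scan_results):
    """scan_results: iterable of (category, url, infectious) tuples from the crawl.
    Returns the fraction of scanned URLs in each category that were infectious."""
    totals, infected = defaultdict(int), defaultdict(int)
    for category, _url, infectious in scan_results:
        totals[category] += 1
        if infectious:
            infected[category] += 1
    return {c: infected[c] / totals[c] for c in totals}

# Illustrative placeholder input only:
print(infection_rates([("pirate", "http://a.example", True),
                       ("pirate", "http://b.example", False),
                       ("news", "http://c.example", False)]))
```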
Glossary
· A web crawler is a program or automated script that browses the WWW in a methodical, automated manner. It is a type of bot or software agent. As the crawler visits URLs, it identifies all the hyperlinks on each page and adds them to the list of URLs to visit.
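As an illustration of that definition, here is a toy breadth-first crawler (not the authors' crawler): it fetches a page, extracts its hyperlinks, and appends them to the frontier of URLs to visit.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

def crawl(seed, max_pages=100):
    """Breadth-first crawl: visit URLs, extract hyperlinks, add them to the frontier."""
    frontier, visited = deque([seed]), set()
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            page = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        parser = LinkExtractor()
        parser.feed(page)
        frontier.extend(urljoin(url, link) for link in parser.links)
    return visited
```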