Sunday, September 20, 2009

Independent Study: Spectator: Detection and Containment of JavaScript Worms

(This is one of a series of posts about papers I'm reading for an independent study with Prof. Evan Chang at the University of Colorado, Boulder. The format is similar to that of a review of a paper submitted to a computer science conference. These are already-published papers, so I'll be writing with the benefit of hindsight, especially when the paper was published several years ago.)

Submission: Spectator: Detection and Containment of JavaScript Worms [PDF]

Please give a brief, 2-3 sentence summary of the main ideas of this paper:

Detecting JavaScript worms can be accomplished by adding a tag to content uploaded to a Web server, associating the tag with the IP address of the client that originated the upload, and using tags to identify propagation chains. A long chain of propagation is a signature of JavaScript worms, so identifying a long chain (where the length threshold is user-defined) should be sufficient for detecting JavaScript worms. Once a worm has been identified, containment is a matter of disallowing further uploads along the chain of known-infected clients until an administrator tells the Spectator proxy that the chain is safe.
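To make the propagation-tracking idea concrete, here is a minimal sketch in TypeScript of how a proxy might maintain tag lineage and flag long chains. This is my own illustration of the summary above, not the paper's actual data structures; the names (Tag, UploadRecord, CHAIN_THRESHOLD, recordUpload) and the chain-walking logic are assumptions.

```typescript
// Sketch of Spectator-style propagation tracking (my illustration,
// not the paper's code). Each upload gets a fresh tag; parentTag links
// the upload to the tagged content the uploading client had seen.

type Tag = number;
type IpAddress = string;

const CHAIN_THRESHOLD = 10; // user-defined cutoff for a "suspiciously long" chain

interface UploadRecord {
  tag: Tag;
  parentTag: Tag | null; // null for content with no tagged ancestor
  clientIp: IpAddress;
}

const uploads = new Map<Tag, UploadRecord>();
let nextTag: Tag = 0;

// Record an upload observed at the proxy, linked to its ancestor (if any).
function recordUpload(clientIp: IpAddress, parentTag: Tag | null): UploadRecord {
  const record: UploadRecord = { tag: nextTag++, parentTag, clientIp };
  uploads.set(record.tag, record);
  return record;
}

// Walk parent links to measure the length of the propagation chain.
function chainLength(tag: Tag): number {
  let length = 0;
  let current: Tag | null = tag;
  while (current !== null) {
    const record = uploads.get(current);
    if (!record) break;
    length++;
    current = record.parentTag;
  }
  return length;
}

// A chain longer than the threshold is treated as a worm signature,
// at which point the proxy would block further uploads along it.
function isSuspicious(tag: Tag): boolean {
  return chainLength(tag) >= CHAIN_THRESHOLD;
}
```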
What is the strength of this paper (1-3 sentences):
The strength of this paper is that it proposes a solution which can be implemented simply as a proxy server in the domain of a Web site operator, and which doesn't require any modifications or plug-ins in the web browser. With a reasonable amount of time and capital, a large Web site operator can implement this solution today.
What is the weakness of this paper (1-3 sentences):
At the risk of sounding too generous, I don't find any significant problems with it.
Evaluation
This is on the whole an excellent paper. I would like to see it in SIGFOO this year.
Novelty
I'm not sure how novel this is, as XSS attacks and JavaScript worms aren't my specialty, but naively, it seems novel. In the spirit of many useful papers in computer science, it combines a number of sound techniques to solve an important problem.
Convincing
I'm convinced not only that the detection and containment algorithms work correctly and efficiently, but also that the author of a JavaScript worm would have a hard time subverting the system. The authors' approach is essentially invisible to the browser, with the exception of some JavaScript injected by the Spectator proxy (Figure 5), which appears to be designed to prevent subversion by malicious code.

While the empirical data in section 5.2 (Overhead and Scalability) is encouraging, a Web site operator would obviously want to subject any implementation of this system to rigorous testing before deploying it.
Worth Solving
Very much so!
Confidence
Medium.
Detailed Comments
I'm curious whether it's possible for each page downloaded from a site to contain JavaScript code that validates that only Web site operator-provided JavaScript is executing in that page: something like an unmodifiable onload event handler, similar to the unload event handler in Figure 5. A rough sketch of the idea follows.
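To sketch what I mean (this is my own speculation, not anything from the paper): the page could register a load-time check that walks the DOM and flags any script element whose source isn't on an operator-supplied whitelist. The script URLs and reporting endpoint below are hypothetical. Of course, malicious code running in the same page could simply remove or override this handler, which is presumably why the paper keeps enforcement at the proxy rather than in the browser.

```typescript
// Speculative sketch (mine, not the paper's): validate at load time
// that only operator-approved scripts are present in the page.

const APPROVED_SCRIPT_SOURCES = new Set<string>([
  "https://example.com/js/app.js",       // hypothetical operator scripts
  "https://example.com/js/spectator.js",
]);

window.addEventListener("load", () => {
  for (const script of Array.from(document.getElementsByTagName("script"))) {
    // Inline scripts have no src; a real check would also need to
    // verify their content, e.g. by hashing it.
    if (script.src && !APPROVED_SCRIPT_SOURCES.has(script.src)) {
      console.warn("Unapproved script detected:", script.src);
      // Report to the operator; the endpoint path is hypothetical.
      navigator.sendBeacon?.("/report-unapproved-script", script.src);
    }
  }
});
```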

Some papers to follow up on:

S. Meschkat. JSON RPC: Cross site scripting and client side Web services. In 23rd Chaos Communication Congress, Dec. 2006.

T. Pietraszek and C. V. Berghe. Defending against injection attacks through context-sensitive string evaluation [PDF]. In Proceedings of the Recent Advances in Intrusion Detection, Sept. 2005.

Y. Xie and A. Aiken. Static detection of security vulnerabilities in scripting languages [PDF]. In Proceedings of the Usenix Security Symposium, pages 271–286, Aug. 2006.

Y.-W. Huang, F. Yu, C. Hang, C.-H. Tsai, D.-T. Lee, and S.-Y. Kuo. Securing Web application code by static analysis and runtime protection. In Proceedings of the Conference on World Wide Web, pages 40–52, May 2004.
