You may have heard this before: “eval is evil”. But is it really? And if so, why is it?
I’ll start with a quote:
> Since the eval method executes script, you have to be more careful about user supplied data that may end up in an eval statement. Suppose you have an input field whose value is saved to the database, and that value is later used in an eval call. This code injection could certainly cause all sorts of serious problems for an end user. This is evil, but the real evil is the code injection vulnerability, not eval itself. If you use techniques to prevent such code from getting into your eval statement in the first place, you shouldn’t have to avoid eval for user supplied data. IMHO, this does not make eval evil, but is a good reason to avoid it when you can.
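To make the scenario in the quote concrete, here is a minimal sketch; the `userValue` variable is illustrative, standing in for a string that was typed into a form field, saved to the database, and later fed to eval:

```javascript
// A user-supplied string ends up in eval (userValue is illustrative --
// imagine it was read from an input field and round-tripped through the DB).
var userValue = "6 * 7"; // harmless here...
var result = eval(userValue); // ...but eval runs *any* code it is given
console.log(result); // 42

// Had the user typed something like "stealCookies()" instead, eval would
// have executed that just as happily. That is the injection vulnerability:
// the danger lives in the untrusted string, not in eval as a language feature.
```

As the quote says, the fix is to keep untrusted strings out of eval in the first place, not merely to rename the function.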
The most practical use of eval in the recent past has been parsing JSON-formatted data retrieved via Ajax, but native JSON support has been available since the ECMAScript Language Specification, 5th Edition (Dec. 2009) and is now in all modern browsers. In browsers that support it, JSON.parse() should be used; eval is only needed for JSON parsing as a fallback for older browsers.
Browsers that support the built-in JSON Object and Grammar (reference):
- Firefox 3.5+
- Internet Explorer 8+
- Opera 10.5+
So my feeling on the subject: avoid eval whenever what you’re trying to achieve can be done without it. You avoid the performance degradation, you reduce the risk of code injection, and unless you need to support an old browser like IE7, you don’t need it for JSON.
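Much of the eval I have seen in the wild falls into exactly this "can be done without it" category. A typical case is building a property lookup out of strings (the variable names here are illustrative):

```javascript
var order = { total: 99 };
var fieldName = "total"; // e.g. chosen at runtime

// Needless eval: spins up the parser and creates an injection surface
// if fieldName ever comes from user input.
var viaEval = eval("order." + fieldName);

// Same result with bracket notation -- no eval required.
var direct = order[fieldName];
console.log(viaEval, direct); // 99 99
```

Bracket notation covers the overwhelming majority of "dynamic code" needs, which is exactly why eval so rarely earns its keep.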
Back to the quote: I agree that eval is often misused, and often in the context of bad code (I’ve seen a lot of it), but if you know what you’re doing, your code falls into the “sometimes useful” bucket, and that certainly is not evil.