Is it bad to use vars()?

开发者 https://www.devze.com 2023-03-20 10:49 Source: web

I have a string expression that I need to run eval on. The expression may contain key names from a given dictionary, and the values for those keys may or may not be strings. So that eval can resolve those names, I create variables named after the keys that appear in the dict and assign each one the corresponding value, as in the example below. But from what I have read here, vars() should not be used. This has made me worry about the stability as well as the risks of what I have implemented. Any thoughts or suggestions on how to implement this in a better way?

def test(e1, d1):
    for k, v in d1.items():  # iteritems() on Python 2
        if k in e1:
            vars()[k] = v
    return eval(e1)

test('x+y', {'x': 1, 'y': 2})

Thanks!!


There is a better way to do what you're trying to do: the optional globals and locals arguments to eval.

def test(e1, d1):
    return eval(e1, globals(), d1)

does the same thing as your code but without needing to muck with vars. If you can get away with passing an empty dictionary as the second argument instead of globals(), that will insulate you from side effects of e1.


Why not do this:

>>> eval('x+y', {'x': 1, 'y': 2})
3


According to the documentation, eval takes optional globals and locals arguments, but they must be passed positionally (only Python 3.13+ accepts them as keywords) — eval(e1, locals=d1) raises TypeError on older versions. Pass an empty globals dict instead:

def test(e1, d1):
    return eval(e1, {}, d1)
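A quick check of both call styles (the version cutoff for the keyword form is to the best of my knowledge Python 3.13; the positional form works everywhere):

```python
import sys

d1 = {'x': 1, 'y': 2}

# Positional globals/locals: supported on every Python version.
assert eval('x + y', {}, d1) == 3

# Keyword form: rejected with TypeError before Python 3.13.
if sys.version_info < (3, 13):
    try:
        eval('x + y', locals=d1)
    except TypeError:
        print('globals/locals are positional-only on this interpreter')
```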
