
Even more fundamental is the nonlocal/global stuff required in order to avoid declaring your variables. Many people are surprised by what this snippet does:

    a = 0
    def f():
        print(a)
        a = 1
    f()
    f()


All it does is produce an error: "UnboundLocalError: local variable 'a' referenced before assignment". The second f() call makes no difference.
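The error happens because Python scans the whole function body when it compiles it: any name assigned anywhere in the body is treated as local everywhere in the body, so `print(a)` reads a not-yet-bound local rather than the global. A minimal sketch of how to get the presumably intended behavior, using `global` to opt back into the module-level name:

```python
a = 0

def f():
    global a   # declare that 'a' means the module-level binding
    print(a)
    a = 1

f()  # prints 0, then rebinds the global 'a' to 1
f()  # prints 1
```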


Is that what you expected from such a simple piece of code?


Well it's f that has the problem, rather obviously, and the other three lines are unrelated. So I don't see how this is some especially simple way of triggering that error.


The other three lines are not unrelated. If you just take the first three lines (and the last one), it's fine:

    a = 0
    def f():
        print(a)
    f()

Maybe this is all obvious to you, but I'll bet you can't name any other language which behaves this way. Compare the original Python to JavaScript (or Lua, Perl, Tcl, Ruby, C, Scheme, Rust, Clojure, or ...):

    a = 0
    function f() {
        console.log(a)
        a = 1
    }
    f()
    f()

This one behaves how I think most people would initially expect the Python snippet to behave. First it prints 0, then it prints 1. (Of course JavaScript has its flaws too...) If you don't believe me, ask your coworkers and friends what they think it does before running it.

My only real point is that Python conflates variable declaration and variable assignment in a way which initially seems like a friendly time-saver, but which ends up being pretty subtle and confusing until you've learned its quirks. All of that just to avoid declaring your variables (in some fictional version of Python):

    var a = 0
    def f():
        print(a)
        a = 1
    f()
    f()

Here it would be clear which are declarations + initializations, and which are only assignments. And for just the cost of typing the word "var", the compiler could tell you when you've made typos in your variable names. As an added bonus, you could get rid of the "global" and "nonlocal" keywords.
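For reference, this is the kind of code the current `nonlocal` keyword exists for: rebinding an enclosing function's variable from a closure. Without the declaration, `count += 1` would hit the same UnboundLocalError. (The `make_counter` name is just illustrative.)

```python
def make_counter():
    count = 0
    def bump():
        nonlocal count   # without this line, 'count += 1' raises UnboundLocalError
        count += 1
        return count
    return bump

c = make_counter()
print(c())  # 1
print(c())  # 2
```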


Of course I'm not a good judge; I've been programming in Python since 1.4, and my colleagues all have Python experience too.

But at least I like that it gives an error message when confronted with ambiguity. That way it doesn't silently do something you didn't expect. Sadly, it only does so at runtime, not at compile time.


I'm not sure what your point is about Python 1.4. Maybe it's that you're past the point where these things trip you up.

However, there are plenty of cases where Python won't give an error message too. Make a typo somewhere in the middle of your function, and it'll quietly introduce a new variable instead of letting you know. A rarely used branch of an if-statement could hide this indefinitely. This could also be avoided (or at least mitigated) by requiring variables to be declared explicitly.
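A hypothetical sketch of that failure mode: the reset branch below contains a typo (`totl` for `total`), so Python silently binds a brand-new local, and the function only misbehaves on inputs that actually reach that branch.

```python
def tally(items):
    """Sum the items, restarting from zero after any negative value."""
    total = 0
    for item in items:
        if item < 0:
            totl = 0       # typo: meant 'total' -- silently creates a new local
        else:
            total += item
    return total

print(tally([1, 2, 3]))    # 6: looks correct
print(tally([1, -5, 3]))   # 4: the reset never happened; intended answer was 3
```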


JS also doesn't behave the way you'd want:

  var a = 9;
  function foo() {
      console.log(a);
      var a = 12;
  }
  foo();

  $node foo.js
  undefined

The same goes for Ruby.


You're sidestepping the problem. The problem is that Python uses the same syntax for variable declaration and variable assignment, and this can lead to unexpected behavior. That's unusual among programming languages, which makes the behavior all the more surprising.


JavaScript is far from perfect, but at least it's clear you declared two variables.



