This page is dedicated to the public discussion of [http://www.python.org/peps/pep-0343.html PEP 343]: Anonymous Block Redux and Generator Enhancements.

I think this is a good one; I hope people agree. Its acceptance will obsolete about 4 other PEPs! (A sign that it fulfills a need and that the proposed solution is powerful.)

Please read the PEP; then add your comments here. Please sign your comments with your name. ["GvR"] (Guido van Rossum)


I feel like PEP 343 and PEP 342 work together in some important way, but it's not clear what that is. Does one depend on the other? What kind of things are possible if both are accepted (vs. just one of them)? I think this is important to understanding where PEP 343 is leading; some discussion of this would be appreciated. -- IanBicking

  • No, they are quite independent. PEP 342 is for making generators into real coroutines. Maybe throw() can help a tiny bit there but I think it's not essential. PEP 342 adds argument-less yield to PEP 343 which is a very minor improvement (no "yield None"). ["GvR"]


The optional extension mentioned would not need to lead to mistakes if the __exit__ call would distinguish between 'with X' and 'with X as Y'. -- Eric Nieuwland

  • Please explain. ["GvR"]
    • The example shows

      f = open(filename)
      with f:
          BLOCK1
      with f:
          BLOCK2

      which fails due to the proposed __exit__ method of file objects. Now if

      with EXPR:
          BLOCK1

      would result in an __exit__ call that can be distinguished from the call to __exit__ by

      with EXPR as VAR:
          BLOCK1

      then __exit__ could close the file in the latter case, while leaving it open in the former. A simple boolean added to the call might do the trick. -- Eric Nieuwland

      • But how will __exit__ know it's the last with-statement? Looks like you've just introduced a different kind of bug... ["GvR"]

        • If you use with EXPR as VAR: the translation creates the object, so the clean-up can be done when the with block closes. If you use with VAR, the object already existed and you should clean up yourself. -- Eric Nieuwland

          • This is getting too silly. That would defeat the whole purpose of 'with var' -- you might as well leave it out and it wouldn't make a difference. Send email if you still disagree. If you "get" it, feel free to delete this whole exchange from the wiki -- it only serves to confuse people. ["GvR"]

The solution to this problem recommended in PEP 346 is to throw an explicit exception in __enter__ when a non-reusable template is reused. PEP 343 uses this approach for the generator-based templates, and I would expect an __enter__ method on file objects to do the same thing. -- Nick Coghlan

  • Yes, that would solve it the other way around. My silliness and Guido's worries resolved, I guess. -- Eric Nieuwland
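
For illustration, a minimal sketch of the approach Nick Coghlan describes above (the class name and error message are hypothetical): a non-reusable template whose __enter__ fails loudly on the second use instead of silently running the block against a closed file, which is the behaviour the PEP's generator-based templates already have, as noted.

    class once_opening(object):
        # Hypothetical non-reusable template: the second __enter__ raises.
        def __init__(self, filename):
            self.filename = filename
            self.used = False
        def __enter__(self):
            if self.used:
                raise RuntimeError("this 'with' template has already been used")
            self.used = True
            self.f = open(self.filename)
            return self.f
        def __exit__(self, type, value, traceback):
            self.f.close()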


A name like next_throw() would be easier to grep for alongside next(). Niki Spahiev

  • Given that 99% of uses of both are implicit, I don't see how that matters. ["GvR"]

I don't want to teach beginners why python uses 'raise' sometimes and 'throw' other times. I predict: they're going to be coming from other languages, they're going to accidentally use 'throw' as the statement, and they're going to get mad at "all of python's weird quirks, like how it has both raise and throw". Drew Perttula

  • I'm not so worried. Your tolerance for language quirks seems rather low. :-) ["GvR"]


I've tuned out the recent conversation on these PEPs quite religiously, because I never thought they would result in anything even vaguely useful. Fortunately, I was dead wrong. PEP-343 is simple, elegant, and feels very Pythonic to me. I need to catch up now and think through the implications and corner-cases, but I can give a provisional +1 to the concept. Thanks to all those that didn't tune out and worked to get us to this point. -- Kevin Jacobs


A few immediate comments:

  1. Is g.throw(...) supposed to let you raise exceptions in other threads (by having g catch the exception you throw it, then raise its own exception for its caller)? The PEP should be clear about this. It would be great if the answer is yes and if that's the case, objects like queues and sockets should be turned into generators to permit cross-thread signalling using generator exceptions. But I had the impression that this would be difficult in CPython.
    • This is not the intention at all. The PEP specifically speaks of "where the generator g is currently suspended". By definition this implies that it is not running in another thread. You must have had threads on your mind too much recently to even think of this. :-) ["GvR"]

      • I will read it again, but I don't remember seeing anything that made me think the generator couldn't be suspended in another thread. phr
        • Generators aren't tied to a thread, but they can only be executing in one thread at a time. When a generator yields in one thread, another thread can resume it with next() or throw() -- but then the resumed generator executes in the thread that called next() or throw(). There's nothing new to this -- it's been like this since generators were invented. ["GvR"]
          • Well, throw() didn't exist when generators were invented, so throw hasn't always been like anything ;-). If throw is supposed to resume execution in the throwing thread rather than the thread where the generator was last suspended, the PEP should clearly specify that. --phr
            • It already says that throw() is just like next(). ["GvR"]
  2. I thought some of the original motivation of PEP 310 was to lessen the pain of giving up the idiom of expecting files to be closed when their last reference goes away. So if 343 no longer guarantees that the exit method will be run when the "with" statement finishes, it's lost some of its main motivating functionality. I wonder if having a more serious treatment of block scope, so that destructors are guaranteed to be called when a scope exits (by falling through or via an exception), would also take care of this issue.
    • Where did you read that the __exit__ method isn't guaranteed to be called upon exit from the with statement? The translation explicitly calls it! Read again. You may be confused with g.close(). That's only relevant when you use a for-loop without an explicit with-statement. ["GvR"]

There was some good discussion on clpy a while ago that I'll try to locate, and add some more comments during the weekend. --Paul Rubin (phr)
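
For reference, the guarantee GvR points to above comes straight from the statement's expansion in the PEP. Roughly (a simplification; the PEP's actual translation also arranges for the exception details, if any, to be passed to __exit__):

    abc = (EXPR)
    VAR = abc.__enter__()   # the 'as VAR' binding, if present
    try:
        BLOCK
    finally:
        abc.__exit__()      # always runs -- normal exit, exception, return, break or continue
                            # (the exception details handed to __exit__ are elided in this sketch)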


Overall I think this is a great proposal. It will allow a lot of things to be made a great deal more natural - probably many things that have not yet been considered.

I can see a lot of areas in the standard libs where improvements could be made using this construct. It will seem slightly inconsistent in areas where this is not done.

The only negative to the proposal is the extra complexity. The idea is a little magical, and may not be easy for new users to pick up. New users will not be forced to use it, of course. Ironically, though, the very usefulness of the idiom will probably lead to it being widely used in libraries, which will require new users to learn it sooner or later!

-- MichaelSmith


I like the idea and the "with .. as .." syntax, where readable keywords are used instead of obscure characters, like decorators use. It looks like Python to me. What I don't like is the use of decorated generators to create the templates. Generators don't seem to fit the problem here. First you have to add generator enhancements. Then you use a decorator, because the enhancements still don't make generators fit the problem. It is almost like generators and decorators are the latest and coolest additions to the language, so we're going to use them for everything we possibly can. I find example 4, using a simple class, to be much more readable than example 1. To me, example 4 shows the relative DISadvantage of using a generator template. I would dump the generator changes and just keep the with statement.

Patrick Ellis

  • Well, you have a choice not to use generators. :-) To many who participated in the discussion on python-dev, using generators to write templates is an essential part. The more state you carry over from __enter__ to __exit__ the more you will appreciate the generator. ["GvR"]
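
To make that comparison concrete, here is the same file-opening template written both ways -- the plain class in the style of example 4, and the decorated generator in the style of example 1 (spelled here with the decorator name it eventually got, contextlib.contextmanager; the names opening_cls and opening_gen are made up for this sketch):

    from contextlib import contextmanager

    class opening_cls(object):
        # Class-based template: state shared between enter and exit lives on the instance.
        def __init__(self, filename):
            self.filename = filename
        def __enter__(self):
            self.f = open(self.filename)
            return self.f
        def __exit__(self, type, value, traceback):
            self.f.close()

    @contextmanager
    def opening_gen(filename):
        # Generator-based template: the same state is just local variables around the yield.
        f = open(filename)
        try:
            yield f              # the body of the with-statement runs here
        finally:
            f.close()

Both are used identically (with opening_cls(name) as f: or with opening_gen(name) as f:); the more state __enter__ has to hand over to __exit__, the more the single function body of the generator version pays off, which is GvR's point above.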


What happened to Fredrik Lundh's proposal (14/5/2005) to use the existing 'try' keyword instead of 'with', eg:

try opening(file) as f: ...
try locking(mutex): ...
try atomic_transaction(connection): ...

Try already has the idea of doing cleanup at block exit. I saw only one reply in support of the idea on the list, but no other comments.

Thomas Leonard

  • If this were combined with an except or finally clause, it would be confusing to the reader which applies to which -- does the except clause catch exceptions in the __exit__ method, or does the __exit__ method see exceptions raised in the except clause? ["GvR"]

    • Python already has this problem when combining except and finally. If we take the same solution (make them mutually exclusive) then there's no problem. You still get a more readable syntax and one less keyword. Thomas Leonard
      • Doesn't help.

        When combining except & finally, the answer is explicit: control moves only forward. (At the very least you can use this to remember the rule.) But when combining with & except (or finally), the handler for with is implicit, so there is no parallel heuristic to decide which one handles which. ["GvR"]


I can't tell from the PEP whether or not it is recommended that objects themselves expose __enter__ and __exit__ (like locks and files). It mentions the possibility, but all the examples do otherwise. Greg Ewing was the only person I remember disliking the idea, but he then changed his mind (to a tepid +0.6).

I do understand the reluctance to pick the one-and-only way an object can be used in a with statement, but I suspect that most classes are factored well enough that the default with behavior is obvious (and more obvious than using a helper function like opening or locking). And any object that may have more than one with behavior can have 0 or 1 inherent behaviors and 1 or more helpers, but I seriously doubt there would be any cases like that in the standard library.

-- BenjiYork

  • Time will tell. The with-statement doesn't force you either way. The PEP doesn't strongly recommend either way because we don't have the experience with this yet to know which pattern is better. Personally I'd like to be cautious -- use explicit wrapper methods like opening() and locking() first, and add __enter__ and __exit__ methods to some objects later. ["GvR"]
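
The two patterns being weighed here, sketched for a lock (locking() is a helper in the spirit of the PEP's examples, defined inline so the sketch is self-contained; the second form assumes the lock object itself grows __enter__/__exit__, which is exactly the open question -- threading.Lock did eventually get them):

    import threading
    from contextlib import contextmanager

    @contextmanager
    def locking(lock):
        # Explicit wrapper: the lock itself knows nothing about the with-statement.
        lock.acquire()
        try:
            yield lock
        finally:
            lock.release()

    mutex = threading.Lock()

    with locking(mutex):      # pattern 1: behaviour chosen by the helper you call
        pass                  # ... critical section ...

    with mutex:               # pattern 2: the object's one-and-only inherent behaviour
        pass                  # ... critical section ...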


Personally I find the half-hearted approach to adding features rather frustrating. I liked new style classes a lot, and I like classmethod and company. But it was frustrating that decorator syntax didn't come along until a couple releases after we needed them (putting aside how absurdly difficult a decision that turned out to be). I think it led to a very gradual and vague adoption of new class functionality. And maybe that's not bad, but as someone who likes the features it was frustrating. At the time, I felt like I was being punished for using new features because the syntax wasn't keeping up with other parts of the language. And the punishment continues, because I don't feel like I can use decorators for another year because I still have to support Python 2.3.

It also means much more tracking of versions. Now will I have to know that, in Python 2.4 I use try:finally:, in Python 2.5 with opening(), in Python 2.6 with open(), etc.? Can't we have a little faith in our imaginations and add a few __enter__ and __exit__ methods that seem so obviously useful, like to files and threading.Lock objects?

Adding features is always a problem, but adding them gradually just makes it that much worse. I think the reception of this feature is positive enough (+1 from me, BTW) that it shows people understand why it matters and understand where we should start using it right away. -- IanBicking

  • Trust me, it is frustrating for me too. But given the large number of users we have to go slowly. Compare it to upgrading a freeway or an airport. It's much more complicated than building a new one, because you have to accommodate existing traffic while the work is going on. But the alternative is Perl 6...

    In this particular case, I actually believe that adding __enter__ and __exit__ to file objects is wrong, but for lock objects it is right. Does that help? ["GvR"]

    • So you don't think we should detect with open(X) as a special language construct?!? Sorry, just trying to think about how Perl would approach this... ;)

      I'm certainly not set on any particular example, I haven't thought it through too much. I think I understand why it's wrong for files -- release implies acquisition, and files can't be acquired in a with statement (they were already acquired when they were opened). Lock objects can be acquired. The equivalent for files might be if there was a path object; acquiring the path object could mean opening it. Wait... no, that's not right. Even putting aside the different ways you can acquire a path (reading and writing), you don't acquire it globally. I guess it makes more sense to me why, for most cases, the objects we have now won't need to grow these methods -- instead we'll be adding methods and functions that will build the new kinds of objects that have __enter__ and __exit__ functions.

      Though it's hard. How do you explain the difference between open() and opening()? It will seem like a strange and arbitrary idiomatic difference if you don't understand the underlying implementation -- and I think the underlying implementation of with (with all the special methods) is not something you'd want to expose to a beginner, though obviously you do want to expose file objects. I don't have any constructive suggestion there, though. Just *please* let's make opening a builtin! We shouldn't pay too much attention to the irrational fear of adding new builtins that some people possess.

      But I digress. It seems like 2.5 (where I assume this will be added) isn't just around the corner, and there's time enough to collect use cases that can go into the standard library along with the new language feature. This will also give people who weren't tracking development a chance to see how this stuff should be used (assuming the additions are implemented wisely). -- IanBicking


(Maybe move this proposal to a separate page? It's also discussed below. ["GvR"])

Is the door open for an optional-indentation syntax (not necessarily now)? It has been discussed at length on comp.lang.python. Basically, I expect that most of the time with-statements will end at the end of the current block, resulting most of the time in over-indentation. Optional indentation would create a precedent in Python and could be added later, but I was wondering what the feeling about it was. One thing I don't like when I do a try/finally for a file read/close in Python is not only the part that PEP 343 is fixing, but also that it indents my code uselessly.

Nicolas Fleury

  • My first reaction is that your proposal is not very Pythonic in nature. The "drop the colon" version is different from any Python syntax so far; everything else either always has a colon or never has one. This makes it easy to spot mistakes where a colon is forgotten (or accidentally added). If you frequently find yourself allocating multiple resources (which I kind of doubt -- Python is not C++ and what you see there doesn't necessarily translate), I think a better solution is the proposal that looks like

        with combined(EXPR1, EXPR2, EXPR3, ...) as VAR1, VAR2, VAR3, ...:
            BLOCK

    We can talk later (post-PEP-343) about making this implicit when writing

        with EXPR1, EXPR2, EXPR3, ... as VAR1, VAR2, VAR3, ...:
            BLOCK

    ["GvR"]


Will 'as' be a true keyword or will it continue to be the semi-keyword it is now, where "as" is allowed as a variable name and in the import statement? -- Andrew Dalke

  • A real keyword. (The PEP says this.) ["GvR"]
    • I read that in the PEP but I wondered if the logic from PEP 221 ("Import As") to allow 'import ... as ...' would apply here as well, to allow old code to still run. For example, code that uses "as" already (perhaps to store the number of 'a's in a string) will break. Looking at 221 again shows it works only because the terms in an import statement are not expressions, but they are expressions here. Hence there's no way to allow this PEP and have 'as' as a variable. Understood now. -- Andrew Dalke
      • And 'with' is also a new keyword. 'as' has been threatening to become a keyword for generations now. I don't have a lot of pity for people who still use it as a variable. ["GvR"]


See WithStatementAndOpenGl


(Maybe move this proposal to a separate page? It's also discussed above. ["GvR"])

Brought up on c.l.py: is there a need for a syntax like

  with EXPR1 [as VAR1][, EXPR2 [as VAR2] [, ...]]:
    CODE

which is exactly equivalent to

  with EXPR1 [as VAR1]:
    with EXPR2 [as VAR2]:
      ...
        CODE

The idea was that if multiple with statements were common then this would reduce the visual depth of indentation. For example,

  with db1.lock():
    with db2.lock() as L2:
      print "db2 lock expires", L2.expiry()
      transfer(db1, db2)

could be turned into

  with db1.lock(), db2.lock() as L2:
    print "db2 lock expires", L2.expiry()
    transfer(db1, db2)

We have no idea if this will occur often enough to be useful.

Ahh, Andrew Dalke again here. Timothy Delaney responded to this on c.l.py:

  • "It wasn't explicitly rejected, but the feeling seemed to be that it was an unnecessary complication as far as PEP 343 is concerned. There's nothing stopping another PEP proposing this as an extension to PEP 343, and there's nothing stopping that being in Python 2.5 if it's accepted."
    • That's my feeling too -- let's explore one idea at a time (despite IanBicking's complaint about fractional progress). One comment: VAR, if present, should have as many pieces as there are EXPRs on the left; the above example would have to be as dummy, L2. ["GvR"]

      • Isn't it better to mimic the import-as syntax? If not, we would end up with things like locking(lock), opening(file) as _, file. Nicolas Fleury

        • Tuple-unpacking can occur, so with foo() as a, b will unpack into two variables; if you allow "EXPR1 as VAR, EXPR2 as VAR" then this wouldn't be possible without some extra parens (e.g., with foo() as (a, b)) -- IanBicking

          • Ok, so that means that mimicking import-as would not be backward-compatible. It means in a sense that we are already deciding if we prefer
                           with foo1(), foo2(), foo3() as _, x, (y, z): ...
            or
                           with foo1(), foo2(), foo3() as x, y, z: ...
            to
                           with foo1(), foo2() as x, foo3() as (y, z): ...
            Is that right? Nicolas Fleury
            • (Please move this thread to a separate page. ["GvR"])


Looks like Guido has already found the WMD (=most controversial proposal of the year) to discuss at PyCon 2006. How will this be explained to the rank-and-file Pythoneers who don't have a math degree and already have trouble grokking decorators with arguments? The only advantage to us peons seems to be opening(), locking() and synchronize() -- not insignificant, but do they justify major surgery to/abuse of generators, an unexpected meaning for 'with', and significant code bloat and complex exception-handling rules? Are exceptions in generators so broken they require emergency treatment? Not knocking this entirely -- the BDFL has always been proven wise (except when he rejected the ternary operator :) ) -- but it does give some of us worrying visions of Python turning into the huge kludgey mess it has so far avoided.

The PEP and discussion start with the syntax and delve mostly into borderline cases, with only passing references to the motivations behind this change. Perhaps somebody knowledgeable could write an intro to the proposal and link to it in the PEP, something that starts with the users' needs and how widespread they are, then shows how each change meets those needs, starting with a normal @locking in Python 2.4 and ending with a with+generator+throw example. Or start with the "Python 3000 way" -- what we ideally want -- then show the compromises for 2.x (like .__next__ vs .next).

Generators are the most popular addition to Python since 1.5.2. Most people think of them as a "lazy sequence". They then proved useful to eliminate top-level 'for' blocks in functions, as "poor man's multitasking", and even spawned generator expressions. All these are sequential in nature: multiple elements. But the proposal advocates widespread use of single-element generators, and requires major surgery and novel use of 'yield' and '.next' to boot. It looks like a kludge, like passing [x] to a function so it can modify 'x' in place. Is this perhaps a sign that a new object is needed for this, rather than piggybacking on the generator-iterator?

In many languages -- including Modula and DTML -- 'with' means "create simple variables that alias to the object's attributes"; e.g.,

{{{
with obj:
    print foo
}}}

means print obj.foo. This has been proposed for Python too. I'm not advocating it but the current proposal would preclude the possibility since this 'with' is apparently incompatible with it. It breaks the tradition of enumerate(), which was not called enum() due to false connotations in other languages.

-- Mike Orr

  • I'll ignore the populist rant. Regarding the other 'with', I used to think that it would be marginally useful, but after reading the C# designers' explanation of why C# doesn't have it ([http://msdn.microsoft.com/vcsharp/programming/language/ask/withstatement/ here]), I'm glad to see it relegated to oblivion. ["GvR"]

    • Anyway, if some people like the other with so much (which I find totally useless in Python; a plain = does the job), they can do the following:
              with renaming(somePath.myObject) as o:
                  o.whyNeedingABlock()
      However, o remains accessible outside the block (even if that is Pythonic, I wonder if it's a good thing). Nicolas Fleury
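
For completeness, the renaming() helper in that example is hypothetical and trivial to write -- it simply hands its argument back, so the with-statement amounts to a scoped-looking assignment (which is why, as noted, a plain o = somePath.myObject does the same job):

    from contextlib import contextmanager

    @contextmanager
    def renaming(obj):
        # Hypothetical helper: yield the object unchanged; nothing to clean up on exit.
        yield obj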
