If you have a function that returns a list, and that function would work just as well if you returned a different iterable instead, you could probably turn that function into a generator function.
Here we have a function called `stop_after`:
```python
def stop_after(iterable, stop_item):
    """Return items from the iterable until the given value is reached."""
    elements = []
    for item in iterable:
        elements.append(item)
        if item == stop_item:
            break
    return elements
```
This function accepts an iterable and a value, and it returns a list of all the items in that iterable, up to and including that value:
```pycon
>>> results = stop_after([2, 1, 3, 4, 7, 11, 18], 4)
>>> results
[2, 1, 3, 4]
```
Right now, our function builds up a list, and then it returns that list. So it does all the work of computing the items in that list right when we call it.
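To see that eagerness concretely, here's a small sketch (the `noisy` wrapper is hypothetical, just for illustration) that prints each item as it's consumed, with the list-building version of `stop_after` repeated so the example is self-contained:

```python
def stop_after(iterable, stop_item):
    """Return items from the iterable until the given value is reached."""
    elements = []
    for item in iterable:
        elements.append(item)
        if item == stop_item:
            break
    return elements

def noisy(iterable):
    """Hypothetical wrapper: print each item as it's consumed."""
    for item in iterable:
        print("consuming", item)
        yield item

# Every "consuming ..." line prints right here, at call time,
# before we've looped over results even once.
results = stop_after(noisy([2, 1, 3, 4, 7, 11, 18]), 4)
```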
But what if we wanted our function to return a lazy iterable instead?
How could we return an iterable that doesn't actually compute its items until we start looping over it?
We could turn this regular function into a generator function.
To do that, we need a `yield` statement. The `yield` statement is what turns a function into a generator function.
As long as we're only adding new items to the end of this list (we aren't modifying items, removing items, or adding items anywhere besides the end), we could probably replace each of our `append` calls with a `yield` statement and then delete our list entirely:
```python
def stop_after(iterable, stop_item):
    """Yield from the iterable until the given value is reached."""
    for item in iterable:
        yield item
        if item == stop_item:
            break
```
We've just turned what was a regular Python function into a generator function.
When we call this generator function, it doesn't actually run the code in that function:
```pycon
>>> results = stop_after([2, 1, 3, 4, 7, 11, 18], 4)
>>> results
<generator object stop_after at 0x7f92ecb2c190>
```
Instead, it gives us back a generator object that will do work as we loop over it.
Generator objects can be passed to the built-in `next` function to get just their next item:
```pycon
>>> next(results)
2
>>> next(results)
1
```
But that's not usually how generators are used.
The typical use for a generator object is to loop over it to get all of its remaining items:
```pycon
>>> list(results)
[3, 4]
```
At this point, we've exhausted our generator object; we've fully consumed all the items within it.
This means that if we loop over our generator object again, we'll see that it's empty:
```pycon
>>> list(results)
[]
```
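There's no way to rewind an exhausted generator. If we need to iterate from the beginning again, we call the generator function again to get a fresh generator object. A quick self-contained sketch:

```python
def stop_after(iterable, stop_item):
    """Yield from the iterable until the given value is reached."""
    for item in iterable:
        yield item
        if item == stop_item:
            break

# Each call returns a brand new generator object, starting from scratch.
fresh = stop_after([2, 1, 3, 4, 7, 11, 18], 4)
print(list(fresh))  # [2, 1, 3, 4]
```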
If you have a function that returns a list, and as you build up that list you're only ever adding items to the end, you can probably turn that function into a generator function by replacing each of your `append` calls with a `yield` statement.
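For example, here's another append-at-the-end function converted the same way (the `first_letters` helpers are hypothetical, just to show the pattern on a second function):

```python
# List-building version: only ever appends to the end of the list.
def first_letters(words):
    letters = []
    for word in words:
        letters.append(word[0])
    return letters

# Generator version: each append call becomes a yield statement,
# and the list disappears entirely.
def first_letters_lazy(words):
    for word in words:
        yield word[0]

print(list(first_letters_lazy(["apple", "pear", "kiwi"])))  # ['a', 'p', 'k']
```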
Generator functions look like regular functions, but they have one or more `yield` statements within them. Unlike regular functions, the code within a generator function isn't run when you call it! Calling a generator function returns a generator object, which is a lazy iterable.
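That laziness even lets a generator function work with infinite iterables, which the list-building version never could. A small sketch using `itertools.count`, with the generator version of `stop_after` repeated so it's self-contained:

```python
from itertools import count

def stop_after(iterable, stop_item):
    """Yield from the iterable until the given value is reached."""
    for item in iterable:
        yield item
        if item == stop_item:
            break

# count() yields 0, 1, 2, ... forever, but we only ever consume up to 3.
print(list(stop_after(count(), 3)))  # [0, 1, 2, 3]
```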