Season 1s are great: setup, some payoff, a bit of lead-in to the overarching story. Then seasons 2 through X: the heroes win and then lose in the final episode, with a cliffhanger into the next season. People get bored, a final season is announced, and they wrap up the show.
It can certainly seem that way sometimes. Shows like The Handmaid’s Tale have been circling the drain of their own premise for a few years now. A big part of it, I think, is that they want to keep their main cast for as long as possible, which limits the options for what can happen.
Give me a mini-series, or even an anthology series, any day.
I love the limited scope of British TV series. They even managed to do only a few seasons of Law & Order, for crying out loud.
It’s not a creative or artistic choice; British channels simply have minuscule budgets compared to their American counterparts.
And yet it often leads to more satisfying narratives.
Look at Black Mirror. The British seasons are some of the best TV ever made. The American (Netflix) seasons have often been meh or downright awful, and derivative of the original seasons.
I think it’s for the best. Too much money in the US is spent on bullshit.
In any case, US versions of British series are almost certainly worse.
As far as I’m concerned, that show ended when the first season did (which corresponded with the ending of the book).
When I heard a season 2 was happening, I thought it might be based around the book’s epilogue. Instead, it’s the same story dragged out long past where it was supposed to end.
I still don’t get why so many people were relating The Handmaid’s Tale to real life. Just as annoying as those who think everything is 1984. It’s a YA series, and not a particularly great one at that.
Not sure about the show, but Margaret Atwood has been at pains to point out that pretty much everything in her books has a real life precedent (albeit in different places at different times).
Child appropriations under the military juntas in Latin America, for one.
Do you really not see the real-life parallels of a religiously ruled country that has enshrined in law ways to take control of fertile women’s reproductive rights? Really?
The US is becoming more and more like Gilead with every passing day.
I don’t see anyone taking away women’s rights, and America is secular.
You can draw parallels between anything; that’s not really pertinent.
Do ya remember Roe v. Wade, or the recent decision that overturned it, thus removing a woman’s right to decide whether or not she will continue a pregnancy?