My first answer is Game of Thrones, which got considerably worse with each season, lol. To the point that I don't care about any spinoff they do.
Other than that, I would have to say The Walking Dead. I liked it at first, but it lost me several seasons ago. Same with Westworld.