Just finished watching the second episode...
It was pretty good. Both Episodes 1 and 2 were good. Episode 2 was a surprise since, when the credits rolled, it showed that this was the episode Jodie Foster wrote. Nice bit o' candy there...
I think I understand a little better what Black Mirror is. It's not dystopian as I first thought. It doesn't show a world destroyed by the evils of technology. Evils can be done with technology, sure. But it's like a disease. All diseases are bad, and some have killed a great many people, but no disease has ever come close to threatening the human race. They are challenging and present woeful burdens, but they aren't the end-all be-all. Likewise, BM is showing us not a technological apocalypse, but the impact of technology. It's like a drug commercial listing the side effects of some concoction with a ridiculously euphoric-sounding name. Technology can and will definitely be a burden on the human race, but its evils are probably not enough to destroy us, right?
Well...that's where I'm now split. Let's take the diseases of the world. We haven't had one that's come close to wiping us out, but those are all natural diseases. When you start trying to weaponize diseases, though, that's a whole different thing. Artificial intelligence does have the capability to evolve into something that sees human beings as better off not being, but I think we would likely find more interesting ways to off ourselves before that happens.
You see...the second episode of BM-S4 was about control. It was about some other things, surely, but it was mostly about control. Control justified by a topically righteous cause: a mother's instinct to protect her child. However, because her control was too total, too intimate, it wound up destroying the basis of their relationship in the first place. It would be easy to parallel this with society's macro-level control over the public, but that's boring. It's our control over ourselves that makes me shiver. It's the imminent demand to be able to control ourselves that makes me think that we, indeed, will be the cause of our own demise. Why?
At the very least there are two yous. There's conscious you and there's unconscious you. Conscious you may know a lot, but unconscious you likely knows a lot more. It controls your body, and it's going off generations of biological input to do this. It's already figured out how to keep you in balance in the best way possible. But what happens when technology becomes so strong that it allows the ordinary public, or even any part of humanity, to manipulate its own makeup at will? What happens when we assume too much control over ourselves? All for the most noble or imminent of reasons, of course? We might be hot-fixing some immediate need, or quenching a desire, but are we going to lose something down the line? It's very much like the Faustian contract. For the first 500 years, things go great, but after that, things start to fall apart, because you only cared to see the immediate benefits and didn't care about the consequences down the line.
Black Mirror is awesome, so far. I was losing faith that I'd ever find something I liked on TV again, but this was a save. Maybe I'm speaking too soon, but...not really. Even if the entire season isn't good, I'm grateful for the shows I like and how they make me think long after they end. They resonate in a meaningful way.