I guess what interests me most about existential risks is the context. A 25-line dissertation:
Humans spent the last ~12,000 years mostly hunting, foraging, and farming.
There was nothing we could really do to cause our own destruction.
We only had to worry about things like starving, bad weather, fires, certain animals, and other tribes coming to take our shit.
However, we also had only so many ways to keep ourselves healthy; if someone got sick, broke a leg, etc., there was not much we could really do.
But all in all, our extent and reach was very limited.
Slowly we became able to solve many of our problems: building stronger shelters, forging steel for tools and weapons that could be passed down, making furniture, and so on. We had yoked the animals and figured out how to use them for labor pretty effectively.
Then we discovered how to burn coal, and then oil. And things really picked up from there.
All of a sudden we could really get to work solving problems, but ever since, every problem we 'solve' adds complexity and interrelated dependencies.
Your eggs are no longer YOUR responsibility; i.e., feed your chickens with crops you or your village grew and collect their eggs.
Now, eggs depend entirely on farms, factories, complicated equipment, trucks, weather, water systems, roadways, electricity, gas, gasoline, and the very economy that demands it all be so.
And if any one of these things goes away, there go the eggs.
All of this added complexity has introduced a lot of very tight couplings.
For example, the only reason we haven't (officially) entered WWIII is that all the major powers have tons of nukes. Keeping them pointed at each other somehow keeps things peaceful, but of course it also raises the odds of severe retaliatory strikes if anything ever does go off.
Imagine a terror attack where a nuke randomly went off somewhere in the USA, and then the US government received a message that nine more are ready to go off in random locations at the push of a button if 'X' doesn't happen.
Also, the more problems we try to solve, the more complex things get, and therefore the more we pour into solving them. It's like chasing the perfect sandwich: you keep adding ham, mayonnaise, cheese, and lettuce, over and over, until the sandwich simply falls over.
We've now put so many resources toward the Internet of Things, Amazon, and overnight shipping, but seriously: if air travel becomes expensive, if there's a huge data breach, if tariffs make trade harder, it could all go kaput!
In fact, I worry that we keep tightening our data security to keep hackers out, but every time we do, the hackers find a way in. Where does this end: with some sort of end-all-be-all security solution, or with the hackers always winning?
This is the cycle I see in almost everything. We try to make the whole world drive electric cars, only to find we can't mine lithium and produce steel fast enough to keep up with demand; then we've wasted not only all of those cars and all that infrastructure but all the fuel burned on extraction in the process, and we revert to a worse baseline.
It's just a race to the bottom across the board. Everyone wants to be first to market in everything, but being first to market means ignoring every risk and doing zero planning and mitigation.
Literally everything we are doing is completely unprecedented, and any number of things could turn into a catastrophic nightmare. I don't even watch the news; I assume it's filled with bad news just to keep people outraged and fixated on the system as a whole.