This is a bit of a side point, but I don't think the amount of information is causing us trouble. It's the speed of it.
Every day we're asked to consider (or are exposed to) so many new topics (and movements) that our attention gets divided, for minutes at a time, amongst so many things.
And what does that lead to? Given so many things to pay attention to, we then have to resort to simplification -- symbols -- to decide what we want or not, what to vote for or against, when there's a world of complication underneath.
We go from symbol to symbol, giving our upvote, which causes huge amounts of disruption to the people and issues that have to deal with the day-to-day reality after we've moved on to the next new thing. We gave our vote based on the symbol (which often turns out not to be what we thought it was), with little thought to the consequences in the details.
I think it's a problem, how the speed of information is making life more shallow, yet more complicated and less satisfying.
Interestingly enough, this idea (or at least a variation of it) has been around for quite some time. Alvin Toffler[0] wrote about similar ideas in the late 1960s / early 1970s, and a lot of his thoughts are expressed in his book Future Shock[1] - which is also the term he coined for the phenomenon he was describing.
Toffler and his work have been discussed here on HN a few times, but I'd particularly call out this discussion, from the article announcing his death.
Future Shock has a mixed reputation, but as I've worked my way through it this year (the 50th since its publication), I've been impressed by both its hits and misses.
The cautionary notes seem more enduring than its optimistic ones, and the commentary on cultural shifts and psychology more accurate than those on specific technical or scientific advances. Even Toffler seems to have underappreciated the scope and impact of infotech change, though that's balanced by overoptimism on other technology fronts.
Summing up his successful prognostications generally: they tend to be about "unintended consequences and side effects".
If you like that kind of thing (I do), then you may be interested in A Brief History of the Future, by Jacques Attali, first published in 2006.
Attali was an adviser to the French President and writes in the French obscurantist style, and most of the book is a history of the past, not the future. But...
Attali does talk about the issues of:
everything-as-a-service;
corporate surveillance and hyper-invasive employee behaviour control (think: FB, Goog, MS, your employer with keyloggers etc);
decreasing relevance / capability of leading nation states, especially the US (demonstrated by this little virus thingy);
interregional and inter-ideological conflict (China cyberhacking turned up to 11, climate refugees);
a turning against (crony) capitalism, which has failed for nearly everybody (by externalising costs, and shifting risks onto households). Extinction Rebellion is an early, embryonic example of this, I think.
And he was writing all that back in the early 2000s.
Before the age of corona, I was really getting into spending hours in the reference and old-books sections of the library, boggling at things like centuries of German-language chemistry journals, shelf-spanning encyclopedias, 1970 AI journals, tomes upon tomes of outdated education-theory books, multiple different complete histories of European art, how-tos for kite-mounted film photography, contact indexes of professionals and society members, compendiums of historical newspapers... Thinking: all this information is fossilized here, and no one is ever going to put it to use again, or even have the attention span to sit and read it, let alone the skills to synthesize knowledge from it, or the guidance of a community familiar with the important texts and authors to navigate to the good bits.
I was very aware of the difference between my mind now and when I was younger, when I could just sit and be absorbed by a book, without trying to race ahead.
For what it's worth, people do still enjoy those books sometimes. You did! I spent a lot of time browsing the MIT library for their century-old books on metalworking, aircraft design, electromechanics, and similar topics. A lot of fascinating information is locked up in those pages, in many ways no less useful than what is published today, but much less commonly known.
I think we will get around to extracting it, but there are a lot of roadblocks to doing anything with it. A big one is that the IP situation is a complete mess, so even if a company or institution did want to digitize it, actually making it available is a nightmare. It's ok for '23 to '78 works, because the copyright term is at least a fixed length, but for any works printed after that the term is variable, so you have to research when every author died. [0]
[0] Not even sure how that works for things like journals or pieces with multiple authors like scientific papers! I imagine for collected works each piece would be under a separate copyright duration?
I did something similar. I would browse my university’s library and come upon all sorts of interesting topics.
I would still do this at a local library near my office. Unfortunately it's closed now, appointment-only, due to covid.
There are so many interesting bits of information sitting in these books. It's almost as if it's waiting for someone to stitch together a new idea or theory.
Isn't that the crux of Jean Baudrillard's Simulacra and Simulation? To be honest I've never read the original text, which is not the most accessible, but I've read many overviews and analyses.
When you create a rich enough symbolic system you can detach from what it was intended to symbolize. If the king ordered a map as detailed as possible, as big as the kingdom itself, and dedicated resources to maintaining it, people would start living on the map itself as it becomes less of a map and more of an alternate reality.
The richness of human thought over history is less and less dependent on reality. It will only increase with time.
When a lot of the information is correcting or contradicting the information that came before, you could have reduced the amount of data by slowing down.
> The speed and amount are tightly coupled - it's not possible to process large amounts of data at a slow speed.
That's true (and much the same thought occurred to me), but there is still a difference between bandwidth and latency, and we use "speed" to talk about both.
Latency in its many forms has (on average) decreased enormously as well, whether we're talking about personal correspondence, fashion trends, research and development, stock trading, political speechifying, or practically anything else.
It's probably easier to distinguish between the two aspects of "speed" through their side effects.
The increased volume of information drives narrowing specialization, for example, while decreased latency increases collaboration.
Economically, you can also think of increasing bandwidth as falling distribution costs, and decreasing latency as falling distribution time[0].
So, yes, increased volume of information means that you have to absorb more of it faster even if nothing else changes, but decreased latency makes you, the human in the loop and the frequency with which you poll your information sources, the only remaining bottleneck. It also makes the dopamine hit of immediate feedback apply to ever broader contexts (thus, gamification).
So we get inane phenomena like "First post!", or two dozen voicemails left before lunch because some horrible event in your city made the national news while you had your phone turned off, or flame wars spiraling out of control over the course of a few hours, and so on.
So, yeah. Information coming at you faster isn't just about the amount of information produced or consumed in a given amount of time, it's also about how quickly you then see others' subsequent reaction/response, and how quickly others can see yours[1].
[0] This is complicated by the fact that even here we're talking about both reduced time-to-publication from automation and dematerialization, as well as the falling cost of speed-of-light distribution, getting it into the hands of people everywhere and anywhere, and into (or replacing) every link in the chain.
[1] "Just because you can, doesn't mean you should." definitely applies here, but as elsewhere, isn't necessarily all that helpful.
Yes, information needs integration, which depends on the integration speed of the recipient. Let's wait 40 years; future generations will become capable of handling parallel streams of wiki updates without breaking a sweat.
The diffusion of technical expertise is an unalloyed good, but the article does not spend enough time exploring the interaction between dispersed expertise and concentrated market power.
There are two very problematic phenomena: (1) plagiarism, and (2) quasi-legal corporate espionage.
In (1), large companies just rip off FOSS code and call it a trade secret, as Goldman seems to have done. [1]
In (2), companies that operate information infrastructure use it to surveil and preempt dispersed talent (as Amazon is alleged to have done with its market data, and Facebook is alleged to have done with its VPN app). [2]
The net effect of all this is suboptimal investment in new tools, techniques, procedures, and business models. You can't invest if you can't profit. A clear market failure demanding public intervention.
Interesting point. Capital (still) trumps knowledge. There is a pop meme that tries to sell "information as the new oil", but your post reminds us that the 'value' of information is akin to the 'weight' of a 'mass'. The gravitational field in question is centralized power, market or otherwise.
[personal meta-aside: Did you know that JFK gave that speech to persuade American press to self-censor? That pull quote has been misused rather egregiously.]
Thomas Jefferson drafted the US Declaration of Independence, and was a particularly violent "slave owner" [1]. While context is important, it doesn't mean that a document with an unsavory origin story should be ignored for that reason alone.
[1] scare quotes indicate that I reject the assertion that anyone was ever "owned" by anyone else. Some people were violently imprisoned against their will, but never owned.
I don't think it is just the authors of the US Declaration of Independence who were unsavoury, but also some of the things in the text itself. The most well-known aspect of its unpleasantness is the part about "merciless Indian Savages"; less well-known is its opposition to cultural rights for French Canadians ("For abolishing the free System of English Laws in a neighbouring Province...")
When Britain conquered New France, Britain promised that French Canadians would retain cultural rights, including the freedom to practice the Catholic religion, protection of the French language, and the continued use of aspects of French civil law in the legal system. It is that last aspect of Britain's commitment which the authors of the US Declaration of Independence found so offensive.
(Strangely, when the US agreed to buy Louisiana from France 27 years later, the American objection to the French legal system appeared to have vanished, and Louisiana still partially uses French civil law to this day.)
I had not heard that Thomas Jefferson was particularly violent in his slave holding vs other slave holders. A quick Google search isn't informative either; do you have any source for that claim?
Henry Wiencek wrote a book about it. That particular work received intense criticism from other respected Jefferson scholars. It's a debate worth reading.
Frankly it’s a bit of a semantic muddle in my own head. For example, is my browsing history “data” or “information”? So for my own personal dictionary, “data” basically means ‘measure’, and “information” is ‘processed data’.
> 2) or this guy used commands that passed the password as an argument (for example "curl -u username:password" maybe svn has something similar?)
Other commands that take password arguments include ldapsearch/ldapmodify/etc (-w argument), mysql (-p/--password option), and Oracle sqlplus (sqlplus user/password@db)
Best practice is that you don't put passwords in command line arguments, and find an alternative method of supplying the password to the utility (such as reading it from a password file, or using non-password-based authentication mechanisms such as Kerberos). Despite that best practice, I'm sure plenty of people still do it.
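For example, here's a rough sketch of what the password-file alternatives look like for the commands mentioned above (hostnames, file names, and credentials are made up for illustration):

    # curl: keep credentials in a netrc file instead of passing -u user:password
    # (~/.netrc contains a line like: machine example.com login alice password s3cret)
    chmod 600 ~/.netrc
    curl --netrc-file ~/.netrc https://example.com/api

    # mysql: read the password from an option file instead of -p/--password
    # (~/.my-creds.cnf contains a [client] section with password=s3cret)
    chmod 600 ~/.my-creds.cnf
    mysql --defaults-extra-file=$HOME/.my-creds.cnf mydb

    # ldapsearch: -y reads the bind password from a file, -W prompts for it
    ldapsearch -D "cn=admin,dc=example,dc=com" -y ~/.ldap-pw -b "dc=example,dc=com"

Even the simple prompt-instead-of-pass form helps: "curl -u alice" with no password and "mysql -p" with no value both ask interactively, so nothing lands in the process list or the history.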
People also sometimes accidentally paste passwords from a password manager into their shell and they end up in their shell history. This can easily happen if you are meaning to paste it into some kind of interactive command such as ssh or su and the command doesn't execute the way you expected.
If you discover you have passwords in your bash history, editing it to remove them (or even deleting it entirely) is a sensible thing to do.
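If you need to do that, a minimal bash sketch (the entry number 1042 is just an example):

    history                 # find the offending entry's number, say 1042
    history -d 1042         # drop it from the in-memory history
    history -w              # overwrite ~/.bash_history with the cleaned list

    # or clear everything, as in the story:
    history -c && history -w

    # prevention: with this set, commands typed with a leading space are not saved
    export HISTCONTROL=ignorespace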
Shell history was never meant to be an auditing measure. There are products out there that will record SSH sessions (by acting as a recording proxy between the client and the server); if you need a list of commands executed for auditing purposes, those sort of products are the proper answer, not expecting people's shell history to be left intact.
> People also sometimes accidentally paste passwords from a password manager into their shell and they end up in their shell history. This can easily happen if you are meaning to paste it into some kind of interactive command such as ssh or su and the command doesn't execute the way you expected.
That doesn't seem to be the case from the article:
> The entire process took about eight seconds. And then he did what he had always done since he first started programming computers: he deleted his bash history. To access the computer he was required to type his password. If he didn’t delete his bash history, his password would be there to see, for anyone who had access to the system.
It seems that deleting his bash history was something _routine_ that he did every time, not some one-off thing to fix an occasional mistake.
This definitely happens! As a non-specific example, there's often an extra step to avoid this by making a "credentials" file with restricted permissions. If you don't do that, you're often stuck passing the password as a commandline argument.
Command-line wifi config comes to mind, but I've encountered this situation in other contexts before. It also happens when you want to connect to internet-facing servers. Usually it's when you're forced to enter the password via a command-line argument.
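A sketch of that pattern for the wifi case specifically (SSID and paths are illustrative, assuming wpa_supplicant or NetworkManager is in use):

    # wpa_passphrase reads the passphrase from stdin if you omit it as an argument,
    # so it never shows up in ps output or shell history
    wpa_passphrase "MySSID" | sudo tee -a /etc/wpa_supplicant/wpa_supplicant.conf
    sudo chmod 600 /etc/wpa_supplicant/wpa_supplicant.conf

    # with NetworkManager, --ask prompts for the password instead of taking it as an argument
    nmcli --ask device wifi connect "MySSID"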
> We're just starting to understand the implications.
Thirty years after the invention of the printing press, one of the most popular printed books was the Malleus Maleficarum, a treatise on witchcraft which triggered a witch-hunting hysteria that lasted centuries. The corrosive effect of disinformation has been clear for centuries and the medium by which it is disseminated is, in my opinion, hardly the concern. My concern is that until very recently, the controllers of these mediums (e.g. Facebook, Twitter) insisted that all ideas deserve a level playing field, rather than accept that some ideas are simply better than others.
> Before the 19th century, invention and innovation emerged primarily from craft traditions among people who were not scientists and who were typically unaware of pertinent scientific developments.
I think for every pre-19th century innovation that occurred in an "information bubble" there were many more that depended on the Renaissance attitudes toward science and discovery that were themselves predicated on earlier discoveries. If I had to give a realistic estimate as to when innovation truly happened without knowledge of pertinent scientific developments, I would look back to pre-Galilean times.
> insisted that all ideas deserve a level playing field, rather than accept that some ideas are simply better than others
Those of us who insist on a level playing field do so not because we care for "bad" ideas, but because we are concerned about who is going to decide what ideas are bad.
> until very recently, the controllers of these mediums (e.g. Facebook, Twitter) insisted that all ideas deserve a level playing field
This is absolutely not true. You can go ahead and write a full blown, grammatically correct and even narratively interesting novel, submit it to a publisher (owner of printing presses), and they are under no obligation to print your book. Further, that publisher may decline explicitly because they don’t like your ideas.
There clearly are questions around FB/Twitter/Google/etc., but you seriously weaken your point making a claim like that.
Until very recently (as in the last 50 years or so), that's exactly what publisher meant. The people who owned the printing presses. The transition of publishers into pure selection/promotion/distribution businesses is fairly recent.
Except that certain things were illegal to publish, even in a country like the USA with its 1st Amendment rights. And some of the printing presses were fully utilized already, so your pamphlet about the true cost of capital concentration or QAnon wasn't about to get printed. And as for your screed against the owners of printing presses ... yep, you guessed it.
> For better or worse, we can expect further blurring of many conventional boundaries—between work and home, between “amateurs” and professionals, and between public and private.
Recently having a child, I'm definitely finding this blurring of amateurs and pros to be true. There are so many blogs and sites saying different things that it's hard to know what or who to believe. I wish there were just one highly-regarded expert source so that I could just go with what they say and not have to research every little thing.
With rearing a child, as stupid as it may sound, trust your gut. If something feels wrong - it probably is. Other than that - don't worry: the best parent is a happy parent.
I was surprised to find out how many "child rearing" experts do not eat their own dog food, or have no children at all.
Or just in general. Peoples’ “gut feelings” about babies have often historically been pretty bizarre.
Recent example, of a quack medical treatment involving giving babies alcohol:
> Prior to alcohol's removal from the recipe, Woodward's maximum recommended dose of gripe water contained an alcohol content equivalent to five tots of whiskey for an 80kg adult. It was only in 1992 that Britain mandated that alcohol be removed from Gripe water, and in 1993 the United States Food and Drug Administration (FDA) ordered an automatic detention of all shipments of Woodward's Gripe Water into the U.S.
Giving babies alcohol, in various forms and often in quite large amounts, was part of the ‘conventional wisdom’ for a long time (though never really endorsed by medical science). If peoples’ guts were so wrong then, I wouldn’t trust them now.
I feel like the need to elaborate on my original statement proves my point. There are decisions that "by-the-gut" parenting could get wrong with disastrous consequences.
Moreover, I don't think there's ever a parenting situation which should preclude research, which seems to be what the parent OP was advocating for.
I believe the parent is presuming a higher base level of rationality in GP's "gut" than that of the population-at-large. While I'm not a parent, I'm certain that my "gut" would tell me that vaccinating my child is the opposite of wrong, however unpleasant for the child it might seem at the time.
Of course, having watched friends become parents it's also quite clear that becoming a parent changes one's brain in some fundamental ways so it's possible you have a point ;)
If you consider the Bloomberg article on SuperMicro, they repeatedly claim expert/insider sources corroborate their version of the story, and have issued no retraction after their story was thoroughly debunked. While this certainly reflects poorly on Bloomberg, it reflects most on the journalistic standards of the article's authors. I don't think we will ever have a world where authoritative sources exist, since there can always be elements within the source of authority willing to bend the truth for more clicks.
I think there is still something to that story. The authors involved are pretty good journalists from what I hear. I think that the coincidence of the US government's efforts to sanction Chinese companies is a little suspect.
Seems an uncharitable interpretation. I would regard that more as a guide to the type of news they want their reporters to focus on - and what their audience wants to read. As far as corporate goals go, it's actually pretty specific and measurable.
I have this view that the previous era of social structure was crafted through long trickles of actual know-how. This made authority feel a bit stiff, but with the new open land of the interwebs we're seeing how a "fluid" variant is mostly good to drown in.
ps: side note, I tried to leverage access to direct science (pubmed and similar) and I was quite surprised that the struggle you describe for mundane topics (no offense to your family) is similar in deep research. A gazillion publications all talking about similar things, but with various conclusions and a lot of maybes. I kinda saw that human society is that thin autofocus line between high-education doubts and high ignorance. We live in the 'field tested' middle ground, which is not truer, just more agreed upon.
As another recent parent I absolutely agree. Add to the mix that you can Google any possible answer you want to see, and it becomes pretty useless as a resource. I have given up on the internet for parenting information, except for healthcare provider information.
As a parent, there are so many things like "my child is hunching during breastfeeding, what does that mean?" or "why is baby choking during sleep" that you will find absolutely nothing on PubMed for.
Ya, there is so much information on raising kids that is just pulled out of thin air. So I started pretty much ignoring anything I see online unless it is accompanied with studies that are done on largish sample sizes.
> I wish there were just one highly-regarded expert source so that I could just go with what they say and not have to research every little thing.
I find the same with different subject matters. Something I really needed to dive into, fish farming, in the end resulted in me taking a one-year vocational course. That was a good expert source. Another thing I do is work with recording music, as an amateur. Again, a subject with tons of information available, most of it poor IMHO. I know a vocational course would help, but most of the material for this is not available publicly. It frustrates me that a lot of school material, paid for by the government, is behind a gatekeeper. Not sure how to solve that.
Compare this to how it was pre-web: Just in time for the post-war baby boom, Dr. Benjamin Spock's "Baby and Child Care" was new parents' single go-to source of information for decades; selling over 50 million copies, it is one of the most popular books ever. His name recognition was exceedingly high; he'd regularly be mentioned on news and political shows, as well as in jokes on Laugh-In etc.
The point being, you're certainly demonstrating an example of the recent trend where "voice of authority" is giving way to "the very notion of an expert is an elitist concept." Then again, maybe Spock is the explanation of all that ails us boomers?
The printing press and the education ecosystem it sparked were way more important for modernity than people give them credit for. Sure, people were motivated to read because they wanted to read the bible, but the skills were most useful for general education and bettering the quality of life.
Imo, one reason Europe is behind digitally nowadays is that it took for granted the advantage that the printing infrastructure sparked. Information technology and the democratisation of knowledge didn't start with computers; there are several IT milestones that improved the quality of life substantially, from fighting nature for survival and having no distributable time to asking yourself each morning what you're going to learn today.
You can do anything you want, every niche you can find has inward paths that you can follow, and becoming good at something only depends on the time you're willing to invest. Having distributable time and developing skills in areas that interest you is pure luxury.
The trouble with the internet is that it precipitously lowers the barriers for the dissemination of anyone’s half baked ideas.
In the past, being able to widely spread (mis)information was limited by circumstance, access, and most importantly, competence. Incompetent people were self limiting in their ability to spread their typically bad ideas because it was too hard.
Now, any semi-literate buffoon with access to twitter or 4chan can reach millions.
The competence filter is gone and now irrational ideas can spread like wildfire.
Folklore, rumors, and gossip are all as old as spoken language. Just a few hundred years ago, educated people still thought there were witches and magical monsters. Only the medium for story telling has changed, but now with the added benefit of it being a little easier to fact-check if interested.
Things are definitely not worse like you're implying.
I think that the dramatically increased speed and volume cannot be so simply dismissed, however. That’s like saying “what’s the big deal with jet planes? We’ve had boats and wagons since forever”.
Fact checking in particular is less effective since there is such an incentive to push out high volumes of information as quickly as possible. I take the example of writing a book for publication versus a tweet or blog post. The former requires a vetting process before it even starts, takes months if not years to complete, involves several phases of copy editing and technical review from multiple people, and finally there’s a published book at the end. The latter could be a single person banging on a phone keypad for a few minutes, or it could be a team who have extensively sourced over the course of months to produce a well-thought-out piece. The trouble is: you as a reader have no idea which is which!
I think that we have too much information. It overloads our brains, and so we fall back on our more primitive instincts to cope. That ends up with us simply confirming pre-existing biases and giving in to fear and anger. It’s like the thesis in the book “paradox of choice” but applied to information rather than brands of jeans or mayonnaise.
Every time I read something saying that it "it used to be harder" or "it used to be easier" or "it used to be"... I'm like no way, history repeats itself for pretty much every aspect of life.
This article is not about the title; rather, it's a survey of societal changes over the last 200 or so years caused by scientific and engineering advances.
If you read it with that in mind, it's much more interesting. If you are looking for an answer to the idea in the title, you won't find it.
Can someone explain how the title fits the article? I was .. pretty pissed once I reached the end (almost all of it interesting) to find it's not related to the title at all.
Until three years ago I had a book in my stacks dating from 1962-ish,[1] called The Information Explosion, about how the modern deluge of information is transforming society.
When I saw this, the adjacent story on HN was "Attention is your scarcest resource". That seems to me to be clearly a consequence of unlimited information.
As far as nuclear energy not seeing more widespread adoption, as readers of HN know, the fossil fuel industry is allowed to externalize costs while nuclear can’t as much for a variety of reasons (e.g., high profile nuclear disasters, etc.). If the true costs of fossil fuels could be better exposed, nuclear would be more competitive. That point is sort of tangential to the article but I wish the author had pointed that out in a sentence or two.
Moreover, they gloss over the fact that nuclear energy, despite its costs, compares very favourably even to renewable energy: compared to wind & solar, it emits several times less CO2, costs several times less (if you account for the batteries needed to store wind & solar energy), and even kills less (per unit of energy).
Not renewable, so it won't last forever. But it sure can dampen the impending long term recession we'll get after Peak Oil.
---
About externalization, it's even worse than most people think: it's not just about making others paying the costs, it's also about delaying payment. For instance, even if we were to stop greenhouse gas emissions right now, dividing our GDP by 2 or more overnight, we'll still get 1.5°C worth of warming in the future, with noticeable consequences already. Of course we won't do that, so the actual warming will be worse. Even more obvious, the depletion of various resources: oil, gas, beach sand, copper…
What we should do, but almost certainly aren't doing, is computing the amortised costs of Global Warming (with population moving because it has become lethally hot in some regions, and lots of other "adjustments"), and the amortized cost of the depletion of non-renewable resources of all kinds, that we won't be able to use at all at some point in the future; then writing that cost on our balance sheet.
We'd probably notice we were heading to bankruptcy very fast if we wrote that amortized cost down.
“If you account for the batteries needed” - eeeek, this is anti-renewable propaganda. These systems are proliferating all over the world without batteries today.
The scenarios people posit that lead to that kind of propaganda don’t really exist at scale.
"Batteries" was a shortcut for whatever storage mechanism we may come up with. Some of them are promising, and hopefully cheap enough to reduce the cost of "on demand" renewable energy. Right now, though, proper storage makes renewable much costlier than they otherwise look.
Without enough storage, you get energy when you have sun or wind, not when the train must depart the station. So what happens in practice? We use other energies to compensate. Now if renewable end up partially replacing fossil fuels like coal or oil, that's good: every joule we get from the wind, we don't have to get it from coal.
If however you compensate with nuclear plants, that's almost a one sided loss for the environment: you still need to maintain the nuke plants to guarantee minimum available power, but you use them less because renewables offload some energy from them. Carbon wise, this is a pure loss: you end up using more concrete & more steel to get energy from the wind (or the sun), and you don't even shut down nuke plants in return (or if you do, you end up trading nuclear for fossil, which is beyond ridiculous). Moreover, those nuke plants are now less profitable, since we need less total energy from them. If you want to reduce risks, that's not a good idea: less profit means less budget, and maintenance may suffer. The only advantage is a slight reduction in nuclear waste, and those are mostly not a problem (just bury them).
Don't get me wrong, renewable energy is the future, one way or another. Nuclear power is only a stop-gap measure, it won't last forever — just like fossil fuel. Right now though, that's a stop-gap measure I welcome with open arms.
One thing we're going to have to stop externalizing is the cost of change.
The degree of fragmentation, required effort to upkeep etc. all to 'keep up' to a platform offering 'neat' features we don't care about?
Too costly.
We are not accounting for this yet, as soon as you take a second to think in at least 6-year time horizons, it starts to sink in.
It's funny because I used to love 'new tech', and now I'm instinctively against it, knowing nothing about it other than 'not standard' is enough to make me feel the amount of extra labour required to deal with it. I like movements such as 'Rust' whereby we see a legit challenger to a foundational tech, but with enough enthusiasm and long term vision that one day, it will be a possible 'off the shelf choice'.
Better title: "Where are all the flying cars?" A 3500-word survey of what
worked (transistor-enabled computing and the internet) and what didn't (space
travel and nuclear) post-WW2. As an alum of the author's employer, I never
thought much of that "field of concentration" known as HistOfSci. Matt Damon's
observation about library late charges was spot-on.