I think it's due to a general decline into literalness.
I'm not sure which came first: audiences that no longer understand symbolism, metaphor, and allegory, or writers who no longer use them. In any case, these devices are largely absent from modern mainstream media. Wherever there's an attempt, it's decidedly conspicuous. There's little nuance and subtlety.
I vaguely recall some sociology and media theory strands that make arguments similar to the quoted post—that we are entering or have already entered an era of post-literacy. Our new language is a language of images (TikTok, Instagram), immediacy, and literalness (does anyone even understand allegory anymore? Does the average piece of media ever express a metaphor?). I don't have numbers on it, but my teacher friends tell me that the typical student's reading comprehension skills have tanked in the past few years.
It's not just reading comprehension, it's the imagination that goes with it.
Text is active. It triggers the imagination. Visual imagery - especially electronic imagery - is consumed passively. What you see is what you get.
Especially with Gen Z, there's been a catastrophic collapse in the public's ability to imagine anything that hasn't been pre-digested by Hollywood movies, video games, D&D, and anime.
It's the same stock imagery over and over and over.
Older culture is "boring" because it doesn't follow the standard tropes, and that makes it incomprehensible.
It's a bizarre kind of deja nostalgia - the only futures that can be imagined are the ones that have been imagined already.
Older works often conform less closely to the tropes, because the tropes weren't established at that point. When a medium or genre is new it often goes through a Cambrian explosion phase where there are all kinds of wildly varied pieces, and then things settle down and coalesce on the specific approaches that have proven successful.
I wonder if the people downvoting you are doing so in good faith, because I think you're on to something. My assumption is that denigrating mass media and pop culture comes across as "elitist".
Oh well. I mean, the person who can look around and feel disdain toward these things deserves whatever shred of dignity the allegation ascribes to them.
"Second Order Illiteracy" is precisely what cripples imagination, or the ability to perceive things beyond the immediate senses. Passively consuming electronic media does the heavy lifting that the literary mind would otherwise do for itself.
> It's a bizarre kind of deja nostalgia - the only futures that can be imagined are the ones that have been imagined already.
If we toss the word "capitalism" into the fray of what you're saying I think this is what Mark Fisher meant by the "Slow cancellation of the future".
Possibly the idea is just too new / people who haven't seen it think it's just dunking on the young generation again. But for example there's an unexpected trend on social media just within the past month of a large number of Gen Z readers not being able to follow "third person omniscient" narration (a term I hadn't heard before, but it's pretty much just what it sounds like; from examples it appears to be how all the fiction I've read is written).
A friend of the family gave my son a guitar a while back and more recently tried to get him to play Sugar Mountain by Neil Young. He worked at it pretty hard and struggled with it because even though it is simple it has to be played with great precision to sound good. Then he discovered grunge and barre chords and had a breakthrough with The Day I Tried to Live by Soundgarden and Rooster by Alice in Chains.
Now he's looking for good songs he can play and that's gotten him into David Bowie songs from The Rise and Fall of Ziggy Stardust and the Spiders from Mars. For a long time I thought of David Bowie as one of those classically trained musicians like Frank Zappa who played rock because it had commercial potential, but he found many songs on that album to be great songs that were within his reach. Now when we have houseguests who say they like Rush he will be able to play the chorus of a few songs in 24 hours and he's building instruments like a Guitar-harp-ukulele (fretless guitar with two bridges, one of which has a harp section) and he's asking me about the physics to build a bass guitar tuned an octave or two below a regular bass guitar.
> he's asking me about the physics to build a bass guitar tuned an octave or two below a regular bass guitar.
I barely know anything about music, and probably less about guitars, but if he can do barre chords, then you can try to build a simple capo with him, since he might readily grasp the utility of having a clamp that essentially gives you another hand on that side of the guitar.
He's been experimenting with clamps, he has one for the fretless guitar section of the guitar-ukulele.
As for the electric quadro- or octo-bass the variables you can tweak are:
* length
* mass/length
* tension
There's a limit to how long you can make the strings before you can't play it, unless you add something to extend your reach like the levers on the octobass. The other two variables sit inside a square root, which is not in your favor. Probably the easy thing to do is find some really heavy strings for a normal bass and see how low you can get the tension.
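For what it's worth, the relationship in play here is Mersenne's law for an ideal string: f = sqrt(T/μ) / (2L). A quick Python sketch (the numbers below are rough ballparks I'm assuming, not measurements of any real instrument) shows why tension is such a weak lever:

```python
import math

def string_frequency(length_m, tension_n, mass_per_length_kg_m):
    """Fundamental frequency of an ideal string (Mersenne's law)."""
    return math.sqrt(tension_n / mass_per_length_kg_m) / (2 * length_m)

# Ballpark figures for a standard bass low E (~41 Hz):
# 0.86 m scale length, ~180 N tension, linear density picked to match.
L, T, mu = 0.86, 180.0, 0.036
f = string_frequency(L, T, mu)  # roughly 41 Hz

# Dropping a full octave means halving f. Because tension sits under
# a square root, that takes a 4x reduction in tension (or 4x the
# mass per length, or doubling the length).
f_quarter_tension = string_frequency(L, T / 4, mu)  # f / 2
```

So going two octaves down on tension alone means 1/16th the tension, which is why floppy, near-unplayable strings show up so fast.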
But really he's the one to build things. Back when I was in physics they kept trying to get me to do experiment rather than theory, if I have any regret it is that if I had studied experiment I'd be able to build all the things that my son wanted to build but, hey, he can build those things now.
There was once a company making a bass ukulele with inch-thick polymer strings. Playing them was kind of hilarious, and it would be fun to see it scaled up to the size of an actual bass guitar... Probably someone has done it...
Life is the ultimate test. We see ourselves reflected in the eyes of others. There may be nothing more humbling than being confronted with our own ignorance, except when we have an audience.
I guess once the strings are too loose, they can't vibrate consistently enough for long enough to be tunable/playable? I'm wondering if a kind of lap guitar, or a guitar laid flat, might allow for pedals that could bisect the strings to shift octaves upward in pitch. Going downward in pitch from an open position is going to be hard unless you have some excess tunable string beyond the last point of contact, and that contact could be released to increase the string length?
You might be able to find an 8-string bass and use two different string gauges. The top four could be heavier gauge and tuned an octave lower. Or you could alternate gauges and silence the strings? I don't know much about playing technique, but it sounds like it might be hard to build in a way that preserves idiomatic playing technique and style. Then again, many alternate tunings and tools already affect how the guitar is played, so that may not be a big deal if he's the only one playing it; if he wants the mechanics to translate to other guitars, those concerns become more relevant.
It might also be possible to teach him how to build simple guitar pedals, which can pitch-shift the signal in post-processing once you know how the parts fit together conceptually.
Your guitar projects sound interesting and would be a good post for HN if you can find the time.
I would not say that the value of art is strictly equivalent to technical difficulty. But I would say that there is a level of technical competence required for art to be good. Something that takes no skill to create (e.g. that absurd banana duct taped to a wall "piece") is not good art, if indeed it can be called art at all.
The fact that people still talk about it and ridicule it 6 years after it was created, and it lives on in the cultural zeitgeist as that, makes it good art. It's literally called Comedian.
I would argue art is not about how "good" it is, but rather how it makes you feel. And the duct tape banana, just by referencing it, is successful in making you feel something.
Right. It's just like music. Some people can appreciate noise music, some people view it as just that: abrasive noise. It's a matter of taste. For some, the unpleasantness is of aesthetic interest and they have an aesthetic appreciation for it.
Great example. It is absolutely a matter of taste. Sadly noise music doesn’t support a lot of full-time jobs compared to writing songs more or less the way the Beatles did. Which was all new and stuff, but not completely foreign to what Stephen Foster did.
We can try to reinvent writing, or we can focus on writing. But one may come at the expense of the other.
This is fundamental misunderstanding of literature.
This is like saying to a musician: I like the melody but you chose all the wrong instruments.
Obviously, the entire character of a song depends not only on the melody (idea) but also on the instruments chosen, the performance, etc. (material).
For literary fiction, the words are the material. What distinguishes literary works is not merely the "ideas" they present but the way in which they are presented. The words are the author's instruments, his paints. This is the difference between writing/reading for information and writing/reading as an aesthetic experience. Literary fiction of course imparts information and ideas, but it is predominantly about the latter experience insofar as the point is the evocative expression of those ideas.
This is why just reading the cliff notes for a literary work is missing the point.
Actually, I'd agree that "fundamental misunderstanding" is too strong. Obviously there is a certain threshold of comprehensibility one needs to achieve regardless of whether one is pursuing aesthetic ends or informative ones.
That said, I would stand by the assertion that reading literature only for the information it imparts is missing much of the point. We esteem authors not solely for their plots and characters, but also for their stylistics—the difference between a great writer and a passing one is often little more than the well considered phrase. The arrangement, use, and rhythm of words are a major component in a literary work.
My point is that asking a writer to "express it more simply or more accessibly" may in many cases amount to asking them to butcher the stylistics that they felt achieved the highest aesthetic quality for the kind of work they wanted to produce.
If one is given a business briefing it is probably the apex of reason to ask a writer to simplify. Are there cases in which this or that phrase in a literary work would benefit from simplification? Yes, but to ask an author to simplify their entire aesthetic approach generally, really seems to me to fail to have appreciated a large part of what distinguishes literature from basic expository writing.
I’d agree with your original assertion. Liking the content but not the form is like looking at a Turner and wishing the ship were closer and he’d chosen a clearer day, or thinking that Monet had some nice flowers in his garden but you’d like him to have painted them more clearly to be sure.
Faulkner,
Thomas Bernhard,
John Barth,
Henry James,
Herman Melville,
Fleur Jaeggy,
Dostoyevsky,
Marguerite Duras,
Poe,
Hawthorne,
Rosemarie Waldrop,
Krasznahorkai,
These are just a few that came to mind. Among them, probably Waldrop, Jaeggy, and Bernhard are the most experimental, but I would argue that none of them, aesthetically speaking, write books that are simple, and I don't think I could argue that any of them should have simplified their themes or style or general employment of language to be more accessible.
Krasznahorkai and Bernhard are great examples. Are walls of text without paragraph breaks harder to read? Yes. But this is an important aesthetic choice. In both cases (all of Bernhard, The Melancholy of Resistance for Krasznahorkai) it speaks to an overbearing oppressiveness that ties directly into their thematics. If you missed this, I think you missed an essential point of their aesthetic and what they were trying to say. We cannot sever form and content. This is why I think it's absurd to complain that someone's work is "not accessible"; it's really silly to demand any sort of aesthetic capitulation on the part of any artist, literary or otherwise, in the first place.
Edit: Faulkner is another good example that's less experimental. I'm sure some readers would have found As I Lay Dying or The Sound and the Fury more accessible if a narrator mediated between the various first-person voices he presents, but this would so drastically change the aesthetic character of these works that they would be essentially different, not equivalent, pieces of art.
I think these are factors to the extent that one sort of needs formal training and schooling in the historical development of the form to appreciate experimental and more contemporary work. The same can generally be said about visual art.
Because of that, yeah, hyper-specialization in schooling and a general movement toward STEM means that a lot of people don't actually acquire the requisite background to engage with and appreciate modern work in a sufficient way, just like someone untrained in computing probably would not have an easy time understanding or appreciating significant breakthroughs in computer science.
I think the analysis of the declining pipeline is spot on. Up until around 2016 or so, I was on track to try my hand at the world of literary fiction—I had participated in several circles in college, sat on the review board of a lit mag, joined a group of writers post graduation, all just to eventually...set it aside.
I always had a "day job" during this time, but other than that I was single and had few responsibilities. This made holding a typical nine to five and actually getting some writing done somewhat tenable.
As my sphere of responsibility expanded (relationships, etc) this quickly became untenable. There's only so much time in a day, unfortunately, and as we continue along a career path, we're incentivized to invest ever increasing amounts of time into that, rather than a far-from-lucrative gamble on literary pursuits.
When you're able to actually make money on your literary work, it establishes a virtuous circle. Writing more makes you a better writer and writing more gets you paid (allowing you to support those other aspects of your life). Contrast this with the modern experience of desperately trying to carve out whatever time you can to make at least a brief writing session happen, amidst being exhausted already by the other demands on your time (your non-writing job being a big one).
From the critical side, I think the situation is pretty much analogous to that of contemporary art. The common person would meet most experimental literary works with a quizzical look, just as they meet most contemporary and conceptual art with a quizzical look. Artists, however, have had better success with this because their objects are not generally mass produced. This has allowed the critical narrowing and distance from the common taste to be buoyed up economically by natural scarcity and the concomitant transformation of the object into a value-holding asset. That can never happen with literature, which is definitionally reproduced at scale.
Parasites. Period. This thing fills my head with a bunch of images from distant warped realities, summoned up out of thin air, it clogs my brain with a flood of other voices, it keeps me moving constantly from one mental landscape to the next.
It is precisely the fever dream experience of a parasitic host as the fiend sucks the life from it.
At this point I’m pretty convinced that social media is a net negative for humanity with little redeeming quality.
There was a time when it did connect people, like you could meet people or find old friends, but that was long ago deprioritized or even stripped away. Now it’s just a pure chum feed that serves up either brain rot trash or political fear and rage bait.
TV was always full of crap but it also gave rise to great shows, to art and lasting culture. It was a medium of mixed value. Social media doesn’t even have that going for it unless some day people celebrate the great Pepe memes. Everything it creates is disposable low effort trash. Nothing worth keeping. You could delete it all and everyone would forget a week later.
Just be aware that people spend more time on HN than they really want to; HN even has a no-procrastination setting that blocks access after some time spent here. So, setting aside dark patterns (which HN does not have), it seems that the sheer amount and frequency of information alone can cause some kind of dopamine dysregulation.
>Parasites. Period. This thing fills my head with a bunch of images from distant warped realities, summoned up out of thin air, it clogs my brain with a flood of other voices,
It does all that by itself? Of course not. This thing is not autonomous. Every app that _fills your head_ was your choice to install, sign up for, and OPEN every time. It's like saying drugs are parasites. Of course they aren't. Your choices made you addicted to the thing. What happened to agency? Is everyone now a slave to social media apps?
As for bloatware-free phones: Fairphone. I've been using one for more than 4 years now, replaced the screen after dropping it, and I bought an extra battery to keep my GPS alive on long ski trips.
Hey mate. I don't want to be patronising, but it seems like you're suffering a bit and might benefit from some professional help. I get it, phones can become a stand in attachment figure for people who are experiencing personal difficulties. There are ways to overcome this but it requires professional expertise. I hope you can get some relief from this. These technologies are omnipresent in modern life and for many of us it isn't a simple matter of choosing to disengage. All the best.
I'm not being entirely serious. I was actually just having some rhetorical fun with the analogy, but it's sweet that the HN crowd is so quick to move into the mode of concern.
But it's not an autonomous entity. It's entirely passive if you turn it off, or uninstall all the apps that bother you. Stop feeding it your attention. Or are you claiming it's taken over your agency as well? If so -- and I mean this as respectfully as possible -- I'd hope you decide to talk with a healthcare professional about this problem.
I used to think the same when people spoke of social media being addictive, but not any more. Drugs aren't autonomous either, however they clearly do remove some people's autonomy. I can well imagine now that a person could be addicted, particularly if he's grown up with it and has used it in the past to avoid facing his problems.
“Thus, a good man, though a slave, is free; but a wicked man, though a king, is a slave. For he serves, not one man alone, but what is worse, as many masters as he has vices.”
― Augustine of Hippo
(Not saying anybody around here is wicked, just remembering a striking quote)
To add to your point, there's science behind it. What makes a drug addictive is that, after you start using it, certain behavioral patterns move from conscious brain centers to habit-driven centers. That's what makes addiction so hard to break; couple that with further biological dependence formation.
I would not be at all surprised if the same phenomenon can happen with internet addiction, if maybe in a more benign form as far as bodily consequences go.
So no, drugs are not "autonomous" but to act like it's an easy matter to simply refuse to do them once you've started is an attitude that's shockingly ignorant of much of the modern understanding of addiction.
I didn't claim it was easy. I even advised getting professional help if the situation is so bad that the person's own willpower is not enough.
Drugs have additional chemical and physiological impacts that make them a very different beast, but obviously addiction can still be brutal even if it’s “just” an experience — look no further than gambling. For which most countries have strict laws and regulations, particularly when it comes to children.
But let’s also not infantilise adults: People need to take responsibility for their own behavioural patterns.
You know that is you, not the phone, that is desperately looking for a distraction and a different reality.
I don't understand why it's ever so trendy to do this performative i-have-minimal-agency-in-my-existence bit when it comes to smartphones; swap that last word with "TV" and the same people professing it would smirk - presumably because that's an old timey, rather than a très trendy, thing to be part of.
Pacifiers. For a while I watched dash cam video from police car chases. I'm over it now, but one interesting thing frequently happens: officers yank the drivers out, get them on the ground, and cuff them. In most cases, the drivers are yelling about their phone. "My phone!" "My phone!" The car is wrecked, they've got knees pressing them into the dirt, they're frequently injured and facing all manner of charges, and the one thing they care about is some phone.
After sitting in the patrol car for quite a while, some officer opens the door to speak with them. Immediately: "Can I get my phone??"
Sometimes they'll still have a phone after being cuffed and detained. They will nearly dislocate joints to thumb their phones continuously while cuffed. They will not stop until someone takes it.
This aspect of the industry really annoys me to no end. People in this field are so allergic to theory (which is ironic because CS, of all fields, is probably one of the ones in which theoretical investigations are most directly applicable) that they'll smugly proclaim their own intelligence and genius while showing you a pet implementation of ideas that have been around since the 70s or earlier. Sure, most of the time they implement it in a new context, but this leads to a fragmented language in which the same core ideas are implemented N times with everyone's own idiosyncratic terminology choices (see, for example, the wide array of names for basic functional primitives like map, fold, etc. across languages).
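To make the fragmentation concrete, here's the same pair of primitives under a few of their aliases (the alias list is from memory, so treat it as approximate; the sketch itself is Python):

```python
from functools import reduce

# The same fold primitive travels under many names:
#   fold / foldl  (Haskell, OCaml)
#   reduce        (Python, JavaScript, Clojure)
#   inject        (Ruby)
#   Aggregate     (C# / LINQ)
#   accumulate    (C++ <numeric>)
nums = [1, 2, 3, 4]
total = reduce(lambda acc, x: acc + x, nums, 0)  # -> 10

# Likewise map:
#   map (most languages), Select (LINQ), collect (Ruby)
doubled = list(map(lambda x: 2 * x, nums))  # -> [2, 4, 6, 8]
```

Same semantics everywhere, N incompatible vocabularies.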
If you find that you're spending almost all your time on theory, start turning some attention to practical things; it will improve your theories. If you find that you're spending almost all your time on practice, start turning some attention to theoretical things; it will improve your practice.
But yeah, in general I hate how people treat theory, acting as if it has no economic value. Certainly both matter; no one is denying that. But there's a strong bias against theory and I'm not sure why. Let's ask ourselves: what is the economic impact of calculus? What about just the work of Leibniz or Newton? I'm pretty confident that it's significantly north of billions of dollars a year. And we what... want to do less of this type of impactful work? It seems a handful of examples more than covers any money wasted on research that has failed (or "failed").
The problem I see with our field, which leads to a lot of hype, is the belief that everything is simple. This just creates "yes men" and people who do not think. Which I think ends up with people hearing "no" when someone is just acting as an engineer. The job of an engineer is to problem solve. That means you have to identify problems! Identifying them and presenting solutions is not "no", it is "yes". But for some reason it is interpreted as "no".
> see for example, the wide array of names for basic functional data structure primitives like map, fold, etc. that abound across languages
Don't get me started... but if a PL person goes on a rant here, just know, yes, I upvoted you ;)
[0] You can probably tell I came to CS from "outside". I have a PhD in CS (ML) but undergrad was Physics. I liked experimental physics because I came to the same conclusion as Knuth: Theory and practice drive one another.
I get weird looks sometimes lately when I point out that "agents" are not a new thing, and that they date back at least to the 1980's and - depending on how you interpret certain things[1] - possibly back to the 1970's.
People at work have, I think, gotten tired of my rant about how people who are ignorant of the history of their field have a tendency to either re-invent things that already exist, or to be snowed by other people who are re-inventing things that already exist.
I suppose my own belief in the importance of understanding and acknowledging history is one reason I tend to be somewhat sympathetic to Schmidhuber's stance.
I'm in the same boat. At least there's a couple of us that think this way. I'm always amazed when I run into people who think neural nets are a relatively recent thing, and not something that emerged back in the 1940s-50s. People seem to tend to implicitly equate the emergence of modern applications of ideas with the emergence of the ideas themselves.
I wonder at times if it stems back to flaws in the CS pedagogy. I studied philosophy and literature in which tracing the history of thought is basically the entire game. I wonder if STEM fields, since they have far greater operational emphasis, lose out on some of this.
> people who think neural nets are a relatively recent thing, and not something that emerged back in the 1940s-50s
And to bring this full circle... if you really (really) buy into Schmidhuber's argument, then we should consider the genesis of neural networks to date back to around 1800! I think it's fair to say that that might be a little bit of a stretch, but maybe not that much so.
> Around 1800, Carl Friedrich Gauss and Adrien-Marie Legendre introduced what we now call a linear neural network, though they called it the “least squares method.” They had training data consisting of inputs and desired outputs, and minimized training set errors through adjusting weights, to generalize on unseen test data: linear neural nets!
... except linear neural nets have a very low level of maximum complexity, no matter how big the network is, until you introduce a nonlinearity, which they didn't. They tried, but it destroys the statistical reasoning so they threw it out. Also I don't envy anyone doing that calculation on paper, least squares is already going to suck bad.
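The collapse is easy to demonstrate: by associativity of matrix multiplication, any stack of purely linear layers reduces to a single linear map, so depth buys nothing. A quick numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stacked linear "layers" with no nonlinearity between them...
W1 = rng.normal(size=(5, 10))
W2 = rng.normal(size=(3, 5))
x = rng.normal(size=10)

deep = W2 @ (W1 @ x)      # the "deep" linear network
shallow = (W2 @ W1) @ x   # ...is just one linear map

assert np.allclose(deep, shallow)
```

Add a single nonlinearity (ReLU, sigmoid, anything non-affine) between the layers and the equivalence breaks, which is the whole point of the objection above.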
Until you do that, this method is a version of a Taylor series, and the only real advantage is the statistical connection between the outcome and what you're doing (and if you're evil, you might point out that that statistical connection, despite being a proof that what you're doing is correct, points you in the wrong direction).
And if you want to go down that path, SVM kernel-based networks do it better than current neural networks. Neural networks throw out the statistical guarantees again.
If you want to really go back far with neural networks, there's backprop: Newton, although I guess Leibniz's chain rule would make him a very good candidate.
Another interesting thing I see is how people will refuse to learn history thinking it will harm their creativity[0].
The problem with these types of interpretations is that it's fundamentally authoritarian. Where research itself is fundamentally anti-authoritarian. To elaborate: trust but verify. You trust the results of others, but you replicate and verify. You dig deep and get to the depth (progressive knowledge necessitates higher orders of complexity). If you do not challenge or question results then yes, I'd agree, knowledge harms. But if you're willing to say "okay, it worked in that exact setting, but what about this change?" then there is no problem[1]. In that setting, more reading helps.
I just find these mindsets baffling... Aren't we trying to understand things? You can really only brute force new and better things if you are unable to understand. We can make so much more impact and work so much faster when we let understanding drive as much as outcome.
I think you should have continued reading from where you quoted.
>> Aren't we trying to understand things? ***You can really only brute force new and better things if you are unable to understand. We can make so much more impact and work so much faster when we let understanding drive as much as outcome.***
I'm arguing that if you want to "deliver business units to increase shareholder value" that this is well aligned with "trying to understand things."
Think about it this way:
If you understand things:
You can directly address shareholder concerns and adapt readily to market demands. You do not have to search, you already understand the solution space.
If you do not understand things:
You cannot directly address shareholder concerns and must search over the solution space to meet market demands.
Which is more efficient? It is hard to argue that search through an unknown solution space is easier than path optimization over a known solution space. Obviously this is the highly idealized case, but this is why I'm arguing that these are aligned. If you're in the latter situation you advantage yourself by trying to get to the former. Otherwise you are just blindly searching. In that case technical debt becomes inevitable and significantly compounds unless you get lucky. It becomes extremely difficult to pivot as the environment naturally changes around you. You are only advantaged by understanding, never harmed. Until we realize this we're going to continue to be extremely wasteful, resulting in significantly lower returns for shareholders or any measure of value.
I haven't read the article or paper yet, but if the gist I'm getting from the comments is correct, Schmidhuber is generally correct about industry having horrible citation practices. I even see it at a small scale at work. People often fail or forget to mention the others that helped them generate their ideas.
I would not be at all surprised if this behavior extended to research papers published by people in industry as opposed to academia. Good citation practice simply does not exist in industry. We're lucky if any of the thousand blog posts that reimplement some idea cranked out ages ago in academic circles are even aware of the original effort, let alone cite it. Citations are few and far between in industry literature generally. Obviously there are exceptions, and this is just my personal observation; I haven't done or found any kind of meta-study of the literature illustrating such.