Not all young people are ‘digital natives’ (theconversation.com)
157 points by thg on March 14, 2020 | 157 comments



I recently changed my business to a situation where I deal with a lot of millennial-aged college graduates. For all the talk of this generation "growing up with the internet", I am consistently surprised by how poor their non-social-media skills are. Things like copying files from a thumbdrive to a computer, filling out a spreadsheet, adding a header or footer to a document - sometimes it is quite shocking.

I graduated from High School in 1994 and had some computer classes leading up to that. I learned how to type a letter in WordPerfect. I learned how to save it to my floppy disk. I learned how to use a spreadsheet to make a budget.

What has happened that we don't teach these basic things any more? Social media is important as a medium but basic job skills are really being left to the employer to teach.

Of course, it could be the kind of young person I am contracting. But they are finance majors, politics majors, and so on.


You want to hear another one?

I am working with a digital native. This week we are supposed to write a small blog post about collaborative tools. He joined the team 9 months ago and it's the first time we've ever worked on something together. We share a Google Drive with 3 other members. I ask him to create the doc, and we'll start working on it.

He creates it, starts working on it, then shares the link in Messenger.

The link doesn't work; I don't have access. Curious, as I am the one who created the drive. Well, I ask him to add me as a collaborator (maybe he created it without sharing). He can't; he can't find where to update the doc permissions.

It turns out he created the file outside our shared folder. He doesn't get how to move it from his folder to ours.

He's 23, fresh out of uni, working as a marketing/digital consultant, writing a blog post about collaborative tools, but he can't figure out how to share a doc.

I concede the Google Drive UI is a train wreck, but still.


I'm convinced this kind of thing will become more common, because so many people use mobile devices as their primary/only computing device.

They have very little concept of filesystems, directories, or even files. For them data lives inside apps, each app only lets you save data inside that app in a predefined location, and moving data between apps is done by hitting the "share" icon.


Growing up with the social implications of the internet isn't the same as growing up with PCs. Smartphones are now what people know how to use and you don't interact with them the same way you interact with a desktop PC. Files? What are files.

There was a time before technical literacy and now we're in a time after it. It was fun while it lasted.


Smartphones offer a much more user-friendly, accessible way of doing common tasks, but there is a much higher barrier to entry to go from "software user" to hobbyist "software creator".

Compare that to the microcomputers sitting in primary school classrooms 25-30 years ago, where you turned the machine on and seconds later you were staring at a command line that directly interpreted BASIC.


I worry this will have an effect on equality of opportunity in the not-too-distant future. There will be some parents who understand this, and some who don't - or can't afford to do anything about it. It's a big swath of options those people will miss out on.


> There will be some parents who understand this, and some who don't

In some ways it is easier to understand (software / IT is perhaps more widely understood to potentially be a good career, given the success of tech companies as businesses & the financial success of many of their employees) but also harder to understand -- there are fewer obviously "tinkerable" computer systems in people's day to day lives.

> or can't afford to do anything about it

I agree that wealthier families generally have access to more opportunities to set their children up to have successful careers (e.g. private tutoring, a wider choice of schools, being able to support children to focus on careers with high initial investments of time and capital vs needing to leave school to help out on the family farm, etc). This has been true for a long time and will continue to be true in future.

But, I reckon that it is much more affordable to buy hardware & software required for a "tinkerable" hobbyist computer lab these days than 25 years ago: what cost $2000 in 1995 might only cost $200 today (esp if you buy second hand hardware and install linux) while also giving far superior hardware performance and better tools for creating software.


Wealthy individuals and their children are probably more likely to have social connections to others with capital who are willing to assist them as well.

A Raspberry Pi 4 kit with 2GB of RAM can be as low as $60 now!! Tinkering with computers can be inexpensive. For those who want it, there is a cornucopia of tinker-able machines out there. Many smartphones have terminal apps you can download. There are also a lot of websites out there that'll let you do something akin to tinkering.

It seems to me that, in many cases, it's more a matter of getting people's attention or interest in tinkering with their computers than of access to tinker-able machines.


Also, you can pick up an old computer from a thrift store or craigslist for really cheap -- much cheaper than a Pi.


Absolutely. A Pi is a luxury item for bored programmers.

You can get much more bang for your $60 buying a PC/laptop.


It's also a barrier to working in a professional job in a lot of cases.


Millennials did not grow up with smartphones, but with PCs. Well, at least if their family could afford them. But having access to the same tools does not mean using them for the same tasks.

Kids would use them for fun; work comes later. And it seems people take it for granted that digital natives not only have no fear of technology, but are also on the same basic level as a properly trained worker, because magic?


> I learned how to save it to my floppy disk.

All the Gen Z "digital natives" I've met have never seen a real floppy disk, except as an emoji.

> Things like copying files from a thumbdrive to a computer, filling out a spreadsheet, adding a header or footer to a document - sometimes it is quite shocking

When there is a clear pattern of how the majority of consumers/users behave with a product, then the issue is with the product, not the consumer.

One day my Gen Z relatives complained to me about their iPhone battery draining fast. I checked their iPhone settings and was shocked at how the defaults were set to suck up their personal data all the time. And quite shocked that they didn't know how to turn it off: Settings > Privacy > Location Services > System Services > Significant Locations & Product Improvement (turned off).

I'm a millennial who used a PC at a time when the ONLY notification was the error message. Even for OS updates you had to buy a PC magazine to know that there was a new OS release... There was no internet, and the PC was so quiet... As a child of the '90s I navigated through every file because I had nothing else to do.

Being curious about every function of the software doesn't seem to be a thing for the new generation... surprisingly, this generation, Gen Z, knows more about social media settings than their device settings.

It's like our generation, the millennials, drove a manual car before we started driving an automatic... Gen Z, the digital natives, started driving a Tesla as their first car.

I always remember this quote by Andreessen:

"The spread of computers and the Internet will put jobs in two categories, people who tell computers what to do, and people who are told by computers what to do."

http://usatoday30.usatoday.com/money/business/story/2012/09/...


I'm not sure a lack of curiosity is so much a generational thing. Some people just don't really care; they learn what they need to get what they want done and don't poke any further.

That’s not really even a bad thing exactly. I’m reminded of the difference in how my dad and I treat cars: I just want to get where I’m going, but he is so into all the little details of driving and the mechanics of it all. He changes his own oil, tweaks his car’s computer system for performance, etc. He is always trying to get me to care about it all, but I frankly don’t care if my oil is a bit old or if I could change it myself or whatever. I’d rather never have to drive at all if I was rich enough to have a private driver do it for me. Anyway whenever I get frustrated that my family or friends don’t seem to take something computer-related seriously or actually try to figure out how to do it right, I stop and think about how frustrated my dad gets about my attitude towards cars.


> I'm not sure a lack of curiosity is so much a generational thing. Some people just don't really care; they learn what they need to get what they want done and don't poke any further.

Basically true, but on the other side there is so much today to poke at, so much you can't understand and control... I would say at some point even the curious ones stop poking deeper very fast. And over time this creates bad habits.


I buy your analogy and it resonates with me. But I was referring to our "personal data".

The car transports us from one place to another without taking/owning a piece of us.

Smartphones take a piece of us, our memories and what shapes our identities... all that makes us who we think we are.

"Personal Identity depends on consciousness not on substance” ― John Locke


Many people see no real difference. They don't really care that their personal data is being used, because it's so ephemeral, so untouchable.


> All the Gen Z "digital natives" I've met have never seen a real floppy disk, except as an emoji.

The concept is still around; just the technology changed. Today it's thumbdrives or some other external drive. Someone not knowing how to operate them today, would they be considered digitally illiterate?


I'm a high school teacher, so focusing on a younger age group than your demographic, but they absolutely don't teach kids these things. The only class we have with a computer is a typing class that lasts 12 weeks where they essentially play games online (the teacher is also a nutjob; anti-vaxxer and flat-earther!)

These kids type their entire papers on their phones and save them via Google Drive. It's not surprising they really don't know how to do anything with computers.


They don't use Google Docs?


Drive and Docs are tied together.


> These kids type their entire papers on their phones

Um... what? This would blow my mind to observe.


Millennial here, and because I spend pretty much all my time on my phone when I'm not programming, I sort of feel a bit like a Gen Z'er.

I do a number of things on my phone that would be faster and 'easier' to do on my mac, but which I don't simply because doing it on my phone is more comfortable/natural.

Yeah, it's weird. However, my phone has become a part of me (in a way). Doing things without it feels unnatural.


>It's not surprising they really don't know how to do anything with computers.

but... they know how to type a paper and save it to Google Drive.

I'm not trying to be pedantic but that's how computers are used nowadays.


Being able to sufficiently handle technology can be a matter of learning a trick and sticking with it. A difference in mentality or 'skills' is the ability and curiosity to learn new tricks when a situation calls for it. So it's not really a generational issue (now that I think about it, generational issues are not really related to the 'quality' of the people in it, but the environment that shaped them - heh), it's a mentality/skill issue. The will to actually solve the problem in front of you is a skill that can be learned, but if everything around you 'just works' you never have the need to develop that skill.

You've been happily using your Generic XPad your entire life dealing with minor UI changes that are explained to you via small tutorials or w/e. You learn your job on the job, specific things in school etc. The furthest you'll go is the search bar on Facebook or a marketplace.

If you encounter something that doesn't just work, how do you know how to figure it out if you've never developed that skill? To you and me it's not that hard; it's pretty obvious.

People just learn the trick, but not how to be curious. Or maybe they just don't care, that's fine too I guess.


>If you encounter something that doesn't just work, how do you know how to figure it out if you've never developed that skill?

They'd probably just Google it, and that would probably be sufficient.


And Google is pretty much how I fix any technical problems given to me in my household.


> I recently changed my business to a situation where I deal with a lot of millennial-aged college graduates.

Are you talking about fresh college graduates, or actual millennials, who are now a decade past that? In the latter case, what kind of business do you have that you specifically encounter this age group?

> Things like copying files from a thumbdrive to a computer, filling out a spreadsheet, adding a header or footer to a document - sometimes it is quite shocking.

You don't learn these things from typical private daily activity, and you easily forget them when they're not regularly trained.

> I graduated from High School in 1994 and had some computer classes leading up to that. I learned how to type a letter in WordPerfect. I learned how to save it to my floppy disk. I learned how to use a spreadsheet to make a budget.

I graduated some years later than you and had those lessons, and several more on office software even after school. But I still could not do it beyond the most basic level without googling it. If you don't refresh those lessons regularly, you forget the details quite fast.

Every now and then I write some more complex formulas in Excel, and I still need to figure out for a few minutes how the most basic referencing works or which formula I'm supposed to use. And I'm a software developer, doing more complex things every day.

Random complexity is a hindrance for everyone.


In their defense, everything you wrote about could be seen as antiquated:

- header/footer: good modern UX always has action and effect close together, so ideally you'd just click that area, not fiddle with nested menus.

- files: it's cloud time; look at files on iOS. A file either lives in the app or goes through "share". And while I personally hate it, it's actually a fair way to have it.

It's just professional tools that have to be complex, because they are more general and powerful. Consumer apps don't have to be anymore, and that makes them diverge from the pro tools further than ever.

We were just early enough to be forced to learn complex and relatively low-level interaction with our computers. That's great for understanding and using professional tools, but I'd argue it's not necessary any more to get by digitally. And even with the professional tools they are often just a bit behind in UX.

I think drawing a parallel to low-level programming languages makes sense. If you grew up with a lower level you'll have learned a ton of advanced stuff, but it doesn't make sense to force CS students to still be fluent in assembly(?) or whatever.


You make some really strong points.


Many years ago a lot of people bought into the myth that because young people grew up with computers they would naturally have these skills and wouldn't be intimidated.

I believe the percentage of people who are terrible with computers hasn't really changed that much. People are much more exposed to computers but I swear some people cannot think rationally about the contents of a glowing screen in front of them. I have literally turned people's chairs around, read them the exact contents of the screen, and gotten understanding and a rational response when they were otherwise completely paralyzed.


There is a difference between being intimidated by technology and being able to operate it at a basic level. Digital natives are overall just people who are not intimidated by technology the way their parents and grandparents were.


I share the sentiment but what the article explains is more shocking than basic technical illiteracy. There’s essentially a subgroup of young people (typically poorer and less educated) who do not believe the news but believe their friends and all the rubbish that comes off their social feed.

I don't want to sound too alarmist, but that really scares me. If I believed all the stuff from my social feed I'd be drinking bleach and worried that the reptilian Royal Family is after me, because I know the earth is flat.


The problem you run into is that every time an expert or mainstream opinion is wrong, those social media feeds gain legitimacy.

You would think this puts a huge onus and pressure on mainstream media and experts to be measured, fact-checked, and accountable, right? Nope! Publish first, then when you've done your damage add a little end note saying that you've edited the article a week later, so you can pretend you're a "journalist" instead of a tabloid blogger.


This is a minor nitpick, but people freshly out of college (generally) aren't millennials anymore. The youngest of that group will turn 24 this year while the oldest will be 40.


I see this a lot at work, unfortunately, and this is at major companies. It seems that the idea of having specs or requirements in properly versioned documents is something completely alien.


Counterpoint, as a millennial, I learned all those things (albeit not WordPerfect) and more in high school.


Us millennials aren't the new generation anymore; we're old now.

Now we get to complain about all these damn Gen Zeds (everyone born '95 and after).

Also, you must not be Canadian, because our govt had a huge hardon for Corel even post-MS Office. Lots of WordPerfect installs on school PCs when I was growing up (I'm a '93).


What's a thumb drive - is that the slot a floppy disk goes in?


> But our social and media users are a group marked by narrow and limited digital media use and a lack of data literacy. They are likely to come from some of the poorest households in the country.

I see this when I open YouTube in a private window. My own feed is full of videos about movies, software engineering, history... I just do not get what the most popular videos are about.

With TV there was a similar drift toward low-quality content. But there was a limit to how low it could go, and at prime time they needed to cater to everybody, including the middle class.

With YouTube and social media, you can live in your own bubble of low-quality content. Facebook is just the same, but much worse.

Many young people are as close to being digital natives as I am to being an airplane pilot because I fly frequently.


I agree with other commenters about "low quality content" being subjective. However, on a couple of subreddits I sometimes skim, I often see tons of people posting links to videos that are just someone's take on a current event, or just a summary of it. I have largely given up clicking on those, because most of them can be summarized in 2-3 sentences much more quickly than watching some video. And yet, on the occasions I actually do click one of those links, the video will have tens of thousands of views. Who is watching this stuff? I understand that the creators are trying to make a living, but I don't understand why people watch the videos when they could learn the same amount of information much more quickly by reading it.


Yeah, there's an annoying genre of videos that's just a dude in his basement talking to his webcam for an hour. Not only is the format low information density, but you can't even extract the key bits by skimming through it.


This is all YouTube's doing. YouTube rewards channels that upload a full video every day, and for individual creators the only way to do that is to talk into a camera for 30 minutes and just upload it.


This is also 99% of “podcasts”. Except instead of 1 dude in a basement, it’s usually a few people on a conference call.


> but you can't even extract the key bits by skimming through it.

I generally avoid that format of video entirely, but using the subtitles as a searchable index helps a lot.


Ugh I know exactly what you mean! A lot of websites have been slowly morphing into "watch this 20 minute video of us explaining to you something that could be documented on 1 piece of paper"


By what measure are you determining which YouTube content is low quality and which is not?

I just opened it up in incognito mode (IP located in California) and the first 5 I see are:

- an Eminem music video

- Gordon Ramsay

- Jimmy Kimmel

- 4 levels of onion rings

- a couple building a shipping container home

Nothing about these seems low-quality at all. Is it just because it's not that educational like software engineering and history?


Well, for Germany it's normally some rap songs (where views seem to be bought to push unknown artists), Turkish soap operas, and low-quality circle-jerk clickbait where some YouTubers comment on what other YouTubers did. It's really a shame.

These days there's some Covid 19 coverage in there, but that only slightly makes it better.


I am not sure incognito will help you avoid geo and other non-cookie-based personalization.


I don't think there is non-cookie-based personalization, just geolocation. Removing cookies makes this obvious: there isn't even a crumb of my recommended videos.

Which makes sense because a list of trending content is a bit nonsensical without a region in mind, e.g. language.


In Europe, I get: a YouTuber telling you about his "battle" with some other YouTuber and how he converted him "back to Islam, inshallah" (on tape, obviously). Both people seem to have a problem with some body parts near their crotch, judging from their gestures. The same "environment" produced a mob of dozens of people attacking each other in Berlin in broad daylight, because one guy claimed "Don't come to Berlin, it's my territory, you, who dared to tarnish my mother's honor in your vid".

Basically it's tribalism 101 spread to thousands of people of generally low education at an amazing rate. And thanks to the bubbles we all know about, nobody challenges this...


When I accidentally went to YouTube with no history a while back, it was all that out-there, ultra-right-wing (by American standards as well) conspiracy crap.


I think computers used to be cool but are no longer perceived as such.


I don't think computers have ever really been cool, except to a niche demographic.

Prior to the ascendancy of the web, computers were the exclusive domain of nerds (who, as far as the mainstream was concerned, were very much uncool) and business people (eternally uncool).

After the web, the web was what was cool, but computers (and their use) just became mainstream.


I think most people agree that cyberpunk is pretty cool, right? Even if you just see the movies: Blade Runner, Ghost in the Shell, Johnny Mnemonic, etc.


I think business people were definitely cool into the '80s. I wasn't there, but the whole Wolf of Wall Street, Reaganomics thing was definitely cool, even if it was destructive. And startup founders were definitely cool for a bit, until we started hating tech. Rich people making money is almost always cool in American culture, whether it's startups or Wall Street or mob bosses.


Bill Gates was quite rich and ordinary people used to talk about him a lot. He was never seen as cool at the time.

He's had a bit of a reputation increase since then, part of it is probably his philanthropy, but also nerdy people are much more accepted and celebrated these days.


Really? I think it's the opposite. It's a golden era for celebrities. You have Facebook, Instagram, YouTube, and other platforms. You no longer need to be invited onto TV or star in a movie. You can trivially publish a blog. Charisma never had as many opportunities as it has today. Trump, Boris Johnson - many people know they're compulsive liars and still don't care. Respect for academics is probably at its lowest. Science fiction was transformed from a popular-science gateway drug into a source of colorful explosions and action movies.


Maybe, I barely remember the 80s, so you might have a point. But even so, the cool business people tended to be the rich CEOs and Wall Street types (and I remember plenty of those as villains as well, think OCP from RoboCop) but still, they weren't the ones using the computers.


Keep in mind that when you open YouTube from a blank history, you're not seeing what most people see when they're on youtube. In fact, due to their personalization algorithm, I doubt that there really is such a thing as a typical YouTube user.

A quick web search suggests that there are 2 billion MAU on the platform, with 73% of US adults using it (roughly 180 million people) [0]. Opening YouTube from a private tab in the Midwestern US, I see a range of recommended videos published over the past five years, with about 1-100 million views each, most of them between 20 and 50 million. This content is all largely inoffensive, clickbaity, and in line with what you might expect from a publication like Buzzfeed, or viral content in the '10s.

These videos aren't the fastest growing; in fact, given their view:age ratio they've mostly peaked, so they were likely selected to appeal to the broadest swathe of the population possible, in order to kickstart user engagement.

Even assuming this content is only seen by Americans, the most popular videos presented have been seen by fewer than half the platform's users.

Interspersed among the cute animal videos, the movie ads, the top-n lists, and late-night TV compilations, there seems to be a selection of niche content which could be used to begin sorting you into different user buckets. Sports videos, gaming channels, indie animated shorts, YouTube Poops, some Spanish content, etc.

The trending tab seems even less representative of the userbase. Its engagement is in the hundreds of thousands, to maybe a couple million over the course of a few days, with notable exceptions being a few music videos, and a recent episode of JRE.

I work with a lot of technically and "high culturally" illiterate people. A few that would be truly considered lowest common denominator.

Not too many of them are clamoring to see the latest music video, or gushing over that Minions movie ad that YouTube thinks a new user might want to see. Nobody really gets on YouTube just to watch top ten cute animal videos unless they're really high.

Videogame videos? Certainly, but there's a huge variety in those. A lot of people use YouTube to learn new skills or view product reviews. Not everybody's after the same kind of highbrow content in niche x. In fact, they're likely after relevant, intelligent content that you don't even consider exists, assuming they're older than 17. There's just too much variety in the population for that.

But sooner or later everybody watches a movie trailer, or a music video, or a fall compilation, or a top ten list that fits their fancy. Their typical view? Not likely. But it's something everyone will probably watch. So that's what YouTube's going to show an unknown user.

If you're disheartened by what you see, don't be. Just because everybody watches schlock doesn't mean nobody's watching in-depth, intelligent content. I'd wager most are.

[0] https://www.businessofapps.com/data/youtube-statistics/

[1] https://www.reference.com/world-view/many-adults-live-usa-b8...


I teach at a community college in the USA. Maybe 10 percent of my students are what I would consider computer literate. 80% can get by, but the other 10% are very computer illiterate.

On Reddit I will sometimes see memes about tech-illiterate professors, but students only think that because they don't see the rest of their classmates trying to use tech in front of them.

I had typed up a list of things that I see students struggle with when they try to use computers, but I don't want to make it seem like I am sitting around going "look at these kids these days!" because in reality, in a class of, say, 20 there will only be 1-2 that really struggle with computers.

-- Of course this will make it a real shit show with all the schools at all levels going to "e-learning" to deal with the coronavirus.


I teach cybersecurity to middle school kids and organize events; I'd say the same, approximately 1% are actually competent.

Though, I think one of the biggest obstacles for a regular student is their utter lack of knowledge of how to actually use a keyboard; that the shift key exists is new knowledge for so many. Thankfully it can be rapidly improved with very little teaching, and things like coding actually become much easier when they make fewer typos and waste less time typing.

Telling middle school teachers to spend four lessons at the beginning actually teaching the tools they're using, instead of Word, is usually received with a lot of negativity; it's somehow considered outdated to learn how to type.


Totally agree with the lack of experience with keyboards. The shift thing reminded me of a student last semester who I saw turn caps lock on to type a short acronym like APCA or something.


I'm far from experienced with keyboards but I do that sometimes. I use shift for starting words or sentences with capitals but every now and then will use caps lock for upper case acronyms. The two extra key strokes to toggle caps lock are easier than keeping the shift key(s) down when typing, especially when typing acronyms I'm unfamiliar with. It's just a force of habit.

People toggling caps for every capital letter will make me shudder though. I've seen them even in (introductory) programming classes. I suppose in an age where most typing is done on smart phones, it makes sense that people use the software-keyboard-style toggle instead of the "old-fashioned" shift keys.

Then again, every day programmers learn about stuff like control+delete/backspace, shift+arrow keys, and other keyboard navigation tricks common in almost any program these days. It's a set of tricks and tips that you need to happen to stumble across to learn about, and if the teacher doesn't know how to type properly, an entire class of school children will suffer the same fate. You can easily spend your entire high school time using caps lock for capitalization without anyone ever noticing or correcting you.


Huh? Both ways of typing that are completely fine. Especially since the 'correct' way of using the opposite-hand shift key is probably the worst way to type an acronym by a significant margin, since it requires so much shuffling back and forth while coordinating chorded keys.


Holding shift doesn't require any shuffling back and forth?


Using the opposite-hand shift key, when pressing letters that go back and forth between sides, requires awkward shuffling.


You use your pinkies and there's barely any movement.


Just as much movement as hitting any other key. Plus slightly more because you have to keep it held during the next key. It's really easy to do, but it's still more than double the effort of hitting a single letter, and when you're hitting many keys per second that makes a difference.

Shift, letter, unshift is great for a single capital letter, but if you're doing that several times in a row then capslock is a lot simpler. As is holding a single shift, if you're used to it.


Wait, do they not teach typing in elementary school anymore?


My teacher had a vendetta against typing (or a hard on for cursive?) so we spent I think a few days (maybe three?) learning to type for an hour each day in the library, and then the rest of the year learning cursive.

I am 25 btw, idk what might have changed, but that teacher is still there.


There was a period not too long ago where half of my friends were in the landscaping business. Volunteer work tends to introduce you to very different demographics.

That is a lot of 20- and 30-somethings who are not at all enthused about technology, informatics in particular. Which pretty much killed my enthusiasm for a side project I had in mind.


I was friends with someone who had a couple of kids who were obsessed with video games. For me as a kid, that naturally led to some sort of hacking. If something didn't work, I would try to investigate or fix it.

For them, it was playing games while constantly talking to their friends, and if not that, watching videos on YouTube of other kids playing games.

One of them, a 13-year-old boy, was upset that his laptop screen didn't work. He had been complaining to his mother for months about it. I pointed out to him that his laptop had an external monitor port and that he could plug it into something else, and it worked fine. He had never investigated this possibility. I also took the laptop and cleaned the viruses and spyware out of it... Something like 42 different packages were removed.

Basically if something didn't work, they would just give up and say that it was "broken".


I saw a teacher somewhere say that he had noticed that kids increasingly have trouble understanding how file paths work on computers. It's something I think is very intuitive and key to using a computer, but I guess when you're only used to smartphones you rarely, if ever, have to think about files and folders.

This is exactly why the Raspberry Pi was created, to enable kids to get computer experience in a world full of locked down phones and tablets.


I spent a WEEK out of my computational basics curriculum teaching middle school students just about paths. Just about FILES and paths. Did this with a VPS and Chromebooks SSH'd in. Also, later, Raspberry Pi units.

They simply don't understand computers when they walk in my classroom door. They understand mobile app user interfaces.
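To make "just files and paths" concrete, here is a minimal sketch of the two ideas they trip over most: nesting, and relative vs. absolute paths. (Python purely for illustration; the names are invented.)

    # Illustrative only: build a nested folder, put a file in it, inspect its path.
    from pathlib import Path

    deep = Path("classroom") / "alice" / "homework"  # nesting: classroom/alice/homework
    deep.mkdir(parents=True, exist_ok=True)          # create every folder along the way

    essay = deep / "essay.txt"
    essay.write_text("My first file.\n")

    print(essay)            # relative path: classroom/alice/homework/essay.txt
    print(essay.resolve())  # absolute path: spelled out from the root, varies by machine

    # Walking back up the tree, one parent folder at a time:
    for parent in essay.resolve().parents:
        print(parent)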


There's something about a computer's file system that makes it a difficult thing for people to learn from scratch. I suspect it's something to do with how folders can be nested inside other folders to an almost infinite degree, and how nothing in the real world really behaves that way.

I tried explaining folders to my uncle once. He said "I don't need to understand all that mumbo jumbo, just tell me the steps!" What he wanted me to do was write down the folders he needed to double-click, and the order he needed to double-click them in, depending on whether he wanted to get to photos, music, or whatever. To him, that was easier than taking the time to learn what was actually happening in front of him.


> It's something I think is very intuitive

There is nothing intuitive about anything. It all depends on pre-existing knowledge and how well people can connect it. Systems like paths and URLs have gone out of style and are hidden from users, so most don't learn about them anymore.

> This is exactly why the Raspberry Pi was created, to enable kids to get computer experience in a world full of locked down phones and tablets.

Not really. The RPi was about enhancing access to computers. Smartphones and tablets weren't much of a thing back then.


I blame Apple for people not knowing about paths. An acquaintance always used Windows and then switched to Mac. She used to have her stuff neatly organized, but a year later there was zero organization anymore; she was just using the Finder. If search is good enough, you don't need organization. Folders and files are hierarchical categorization; you don't need that if you can access everything by search.
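A minimal Python sketch of the trade-off (folder and file names are hypothetical, just to illustrate):

    from pathlib import Path

    # Hierarchy: you must remember where you filed the thing.
    doc = Path("Documents") / "taxes" / "2019" / "return.pdf"

    # Search: scan everything and match on the name; no filing required.
    hits = [p for p in Path("Documents").rglob("*.pdf") if "return" in p.name]

Search works until names collide; once two files are both called return.pdf, the folder they sit in is the only thing that tells them apart.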


For security purposes and everything, it's probably better to abstract away the idea of files and wall off apps like on mobile, and usability-wise, super-efficient search probably makes tech more accessible to users. They might not know how to use a file manager, but they still know how to use their computer.


Yeah, it's absolutely fine for casual consumers, but similarly to cars, you're going to be really lost if something on one of those multiple layers of abstractions malfunctions.

People will have to accept paying for computer repairs like they do for car repairs. Moving everything into the cloud won't help; Google will be weird and generally has no support, so you'll need somebody to figure out why those docs aren't visible in Drive.


The Finder exposes the filesystem hierarchy much like any other desktop file browser.


This link is about inequality in the U.K., but what's even more interesting to me is the massive number of people that don't use the Internet at all. Only about half of the world population is online and there are dozens of countries where only 20-30% of the population has "accessed the Internet in the last 12 months from any device, including mobile phones."

- DRC (8.62% of 81,339,988 people are online)

- Nigeria (27.68% of 190,886,311)

- Indonesia (32.29% of 263,991,379)

- Pakistan (36.18% of 220,892,340)

We truly are only at the beginning.

https://en.wikipedia.org/wiki/List_of_countries_by_number_of...


The United States has a shocking number of people who are not on the internet, as well.

The company I work for deals primarily in healthcare for the poor, and even though we have a fleet of web sites, social media accounts, and a text messaging program, we have to also send everything out on paper in order to make sure we reach as many people as possible. And since we're in healthcare, we don't get to be all SV Bubble about it and say, "Well, they should just go buy some money and get a smartphone."

My company requires that all departments have some amount of hands-on with our customers, including us button pushers, often in their homes and neighborhoods.

It's amazing the number of people I've met who have no computer, no internet connection, not even a cell phone of their own. Often a single flip phone will be shared by all the members of a family. Sometimes, a single flip phone will be shared by four or five families living next to one another.


Similar space here. Don't discount SMS as a delivery channel, it hits a surprising number of folks, including those without a fixed address.


You're right. Last I heard, we had about 90,000 people in our text messaging program.


I mean, I'm a professional software developer (among other things), and when I'm commuting in highly populated areas on regional trains, the quality of Canada's telecommunications networks, the most expensive in the world for subscribers, is so extremely poor that SMS is the most reliable way to reach me during that time. It is pure luck that my workplace isn't in a dead zone for LTE, or even for 3G strong enough to refresh Hacker News.

SMS is absolutely necessary for me, I still have a basic GSM phone for phone calls and SMS because it is more reliable than current stuff and has a month of standby time.


It's also one of the cheapest channels, and having worked with transient and Medicaid populations it has become essential.


> Sometimes, a single flip phone will be shared by four or five families living next to one another.

Now that is an interesting surprise! Is that a cultural/knowledge issue, or is it poverty doing that?


> Now that is an interesting surprise! Is that a cultural/knowledge issue, or is it poverty doing that?

In my estimation, it's just poverty. They know that other families have a phone for each person. They see the billboards, they see the TV shows, they have children who go to school with kids that have their own phones. But some of these people are working two full-time jobs, or even three part-time jobs, and still struggling.


I thought there was a federal program to provide phones to people on low incomes - the so-called Obamaphone (although it was actually Reagan).


There is, but there are significant barriers to entry. You sometimes see commercials for it on TV, and notice they all feature well-off 60-somethings enjoying their retirement savings and government benefits with their grandchildren. That's the target demographic for the free phone providers.

The adults I deal with have an average of a fifth-grade education. They can't fill out forms. They don't know about government programs. There is sometimes a language barrier. And to them, a phone is a precious, expensive thing and they worry that if it breaks, they'll be cut off again and owe some giant telco money that they don't have.

We have people on staff who try to help them with this sort of thing, but it's an uphill battle.


What's the point of looking at third world countries? They don't even have running water everywhere, of course not everyone there has Internet access.


The point is that designing interfaces and technology in general should not assume that everyone is a "digital native." It's only a matter of time until the entire world goes online and with that comes a lot of startup opportunity.


Sure, but you're not designing interfaces and technology for those that aren't online, are you? Like, you're not making an app for rural Pakistanis that can't read and don't have smartphones.

The whole world might be online at some point, but you'll still not design your apps for Pakistan, because the differences in adoption speed of technologies, trends, cultures, etc. will not go away just because people are technically online now.

That's not to say that Pakistan itself won't be a strong market to target. But it's a niche market; you'll likely not target the US, Pakistan, and France in one app, just like you don't today.


You'd be surprised how many people in third world countries have internet and cell phones but no running water or electricity.


No, I'm not. Having internet and cellphones enables advertising and location tracking. Running water and electricity doesn't.


Something like 99.9% of the people from 3rd world countries who currently don't use the internet would end up being lurkers even if they did get connected. So no, nothing would really change apart from ad revenue and view counts perhaps.


I don't understand how this is surprising. We have reached a point where devices and software work well enough that you don't have to worry about how they work. The same happened with cars and probably a lot of appliances. Some decades ago you needed to know how they worked so you could repair them. Nowadays not many people have even the foggiest idea how a car works. They just use it.


I've seen HNers assert that everyone should learn to code or know how a computer works, which I think is just hubris because they themselves happen to care about those things, or confusing it with something particularly positive about themselves.

Meanwhile, there are plenty of things those same people don't care about knowing, like how an engine might work, how their girlfriend of five years puts on makeup despite her doing it daily, how electrical/water/gas works in their house, the history of civilization, how their government works, etc.

Life is a crap sandwich in so many ways that I don't think these expectations are fair until we're immortal with infinite leisure time.


Disagree with this. As a 26-year-old software developer, it's been fascinating to see just how many of my peers in non-software industries are now interested in learning to code because it's useful for their jobs. Coding allows you to talk to machines, and machines are everywhere in modern society.

Here in the UK, basic coding is now taught in schools (from primary school!). And I think this makes sense. Not everybody's going to be an expert. But the basics are incredibly useful, just as with math, science, history, etc.

Note: most of the other things you list as things that people don't care about knowing are also taught to a basic level in schools.


> it's been fascinating to see just how many of my peers in non-software industries are now interested in learning to code because it's useful for their jobs.

Agreed here. One of my co-workers started a coding club at lunch, for people in other departments who don't know how to code. For several of them it's changed from black magic to an amazing productivity enhancer.


I'm of the opinion that everyone should learn to code, but not because coding is universally useful. Rather, it's because learning to code is the best way I know to learn to be okay with failure. Shit breaks, you try again.


I think just living teaches you that lesson.


You learn that elementary life lesson doing literally anything in life.


Classic blog post from 2013 on this subject.

http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-co...


Whether someone is a "digital native" or not has everything to do with 1) whether they're interested in creating something, and 2) what they're interested in creating.

I may think they’re really missing out on everything that a computer/the internet can do, but there are plenty of people really into cars that feel the same way about me driving a 23 year old vehicle.


The thing I've noticed about the younger generation is that they are better at consuming tech intuitively, but mostly have no idea what makes it work. They seem to be able to use technology almost instinctively, but if they have to troubleshoot or create anything they don't know where to start.

You have to design UI for 'digital natives' to be just as user friendly as you would for baby boomers that didn't see their first computer until after they were grown. Both of them would be equally lost if you plopped them down at a terminal with nothing but a blinking cursor. This is generalizing, of course, there are exceptions.

Anecdotally, the only thing that separates a 9-year-old and a 60-year-old is that the 60-year-old won't want to touch a computer because they are afraid they will break something, while the 9-year-old will mash away until they get where they want to go. Neither knows or cares how or why it works.


This has been my experience. I have kids from 6 to 23. They all come running to me anytime they have a technical issue. I’ve tried to teach them troubleshooting steps or how to search for help, but they aren’t interested. As far as they are concerned it should just work and if it doesn’t it’s someone else’s job to figure out why. Their friends all appear to be the same. I know of only one teenager who has a glimmer of interest in understanding what is happening behind the screen.


How could they understand how things work? The technology, the computer itself is hidden behind user experiences.

http://contemporary-home-computing.org/RUE/

> This approach leads to some great products on screen and IRL, but alienates as well.

> Robotics doesn’t give us a chance to fall in love with the computer if it is not anthropomorphic. Experience design prevents from thinking and valuing computers as computers, and interfaces as interfaces.

> It makes us helpless. We lose an ability to narrate ourselves and—going to a more pragmatic level—we are not able to use personal computers anymore.

> Every victory of experience design: a new product “telling the story,” or an interface meeting the “exact needs of the customer, without fuss or bother” widens the gap in between a person and a personal computer.

> The morning after "experience design:" interface-less, disposable hardware, personal hard disc shredders, primitive customization via mechanical means, rewiring, reassembling, making holes in hard disks, in order to delete, to log out, to "view offline."


I was talking to some of my fellow Gen-X friends and we are jokingly complaining that kids these days don't even know how to pirate things. If it isn't on a streaming service they don't know how to access it.


TV and radio were the same way. My grandfather knew how to replace tubes and test circuits. I knew about adjusting antennas.

My son’s technical knowledge of TV is charging the remote, rebooting the router and reseating connectors.


That actually seems pretty good, especially since if you aren't using an over the air antenna there isn't much else you can do with modern flat screen TVs.


I agree! As things get mature, there’s no need to tinker.


This is mostly my experience. I'm 18, and a lot of the people that go to my school only know how to use social media; they come to me when they need tech help. My closer friends are more tech literate, but they still come running to me when they need help with something more complicated. I suppose I should be happy that they even have interest in getting help. Most would just give up.


As someone who grew up tinkering with computers I can relate but then I remember the perception of my elders towards my generation. I remember all the things I couldn’t do or didn’t pick up on that my family (and extended relatives on the homestead) could do. They could so easily fix and build everything and anything around the house and farm. Everyone was just so incredibly handy.


And not all slightly older people meet the expectations of tech use for that slightly older generation. I think it's not uncommon for HN, but I have a minimal social media presence, am rarely an early adopter -- more like a stage-2 adopter once something is proven -- and am very much not on board with social media adding anything positive to my life. So-called digital natives are, I think, becoming increasingly skeptical as well, in a grass-roots phase of limiting contact to closer, well-known friends/acquaintances. Not yet a majority, no, but the beginning of what has the potential to be a trend.


My siblings are 10+ years younger than me and have very little technical savvy, despite holding advanced hard-science degrees, whereas such things are my profession. I think part of the difference was growing up in an era where I could break things easily, and would have to understand and investigate them in order to fix them. These days, this sort of understanding isn't needed, so unless you have a burning curiosity, there isn't the ambient technical nourishment you are required to imbibe. No élan vital, as it were.


In recent days I've been starting to feel this idea of "digital natives" is a silly one. There's no such thing as a "reading native" or "math native" – these are skills and concepts we learn in school because society at large has decided they are important enough to warrant the investment.

On the other hand, there's no standard curriculum or expectation around computers and software (afaik). We have this expectation that people will learn how to use things by osmosis or will be taught by the thing itself, perhaps because some kinds of software are extremely good at this (primarily games) and some types are easy enough to get started with and have a motivation (e.g. social) that can "pull" users through their difficulties (or they can ask friends). People jump to conclusions - "if the kids can do that, they can do everything" - which is clearly not the case. My knowledge of Excel won't help me with Snapchat, which won't help me with Blender - and the difficulty difference between commonly used consumer apps and work apps can be significant.

The issue is that outside of the easy spaces, learning about computers is boring, difficult, doesn't have obvious usefulness, provides largely useless feedback, is often wildly unintuitive compared to how things work in the real world, and tends to require a baseline of knowledge most people won't have the interest to develop. This isn't surprising - the same applies to many subjects we learn in school. Most people will have no more interest learning common computer abstractions and mechanisms than memorizing their times tables or writing essays with an opening, body, and conclusion or looking up words they don't know in a dictionary.

In short, computing is something that should be actively taught in schools. Typing is a basic skill that should probably be taught in elementary, when children are learning to write (or shortly after). UI abstractions like files, folders, windows, searching should probably follow. Word processing should be covered before middle school, other productivity software during, along with more complex abstractions (clipboard, storage, paths, permissions, networks, accounts/passwords/security, scams, where to get help). Introductory programming should be taught as well.

None of this is to say that the average student should be a capable programmer after completing K-12. But I think that almost everyone should know enough to draw reasonable conclusions about systems and problems they encounter without outside help, if for no other reason than to avoid being exploited by scams, ransomware, ads, and whatever new attack vectors show up in the coming years.



The tech community (i.e. us) celebrated the mobile "revolution" and simple (i.e. dumbed-down) UIs for more than a decade. You reap what you sow, though: you have kids growing up not knowing what a file is. This is no testament to the success of the mobile, dumbed-down revolution either: people use their phones more, but the only actions they know are scroll and like, so it's not like the UX empowered people; it just steered all their energy into consuming. There are fewer and fewer features everywhere, and we are now celebrating new emojis. We are essentially creating a new inequality with the "dumb it down" revolution, which pushes people into walled gardens and single-function apps to avoid the pain of learning elementary configuration.

In more concrete terms, Apple was probably the first preacher of hiding complexity behind a mask, and then almost everyone followed, even those that shouldn't have (e.g. Linux, Windows). Gone are the logical, consistent, hierarchical UIs like Windows 95; now Windows doesn't even have a proper start menu, and you need a search bar to navigate the Control Panel. Of course people will never learn about a directory tree if they never see one, but those concepts are key, and understanding them simplifies the understanding of just about anything.


My kids were unfortunately indoctrinated by Chromebooks and iPads in our public school system. Using real computers to do things is way too complicated and frustrating for them. Most kids are just like them - they're great at using very limited technology for very specific things. The rest of computing is a mystery to them. They type on keyboards at less than 30 WPM. At their age, I was doing 50 or so - a requirement for the typing class to "meet expectations".


Well clearly. Almost everyone alive today is an "electricity native", but it's only a very small minority that can safely install wiring and the like. Everyone else just interacts with the consumerized version of it where everything interesting but potentially dangerous has been abstracted away. There is no reason to believe that younger people are more savvy about the workings of digital technology just because they grew up with consumerized versions of it.


It's worse today than it was in the 90s and early 2000s. Back then the web was an indie experience. If you wanted your own blog, you had to host it. Or if there was a platform you could use, it let you customize the HTML and CSS. An entire cottage industry of websites hosting code snippets sprung up, and all the kids were using HTML and tweaking it.

If you wanted a video game forum, you needed to host it. You had to learn how Linux servers worked, what MySQL was and how tables and migrations functioned, and might even need to write a little PHP on the side.

When I was a teenager, I had a PHP website that did game matchmaking, lightweight article publishing, and custom (non-phpBB) forums. I put this under Subversion source control and let several of the members that were code literate contribute. We built a lot of stuff and had a whole thing going. Multiple websites. And then we stood up an IRC server, which gave way to bots and log ingestion and archival.

We ran a MediaWiki instance (http://strategywiki.org), we knew the folks that wrote the Gamecube LAN adapter tunnel.

When it came time for college, I was way ahead of the curve. I blew through all of the upper level courses before my electives, and I was a TA in my freshman year. (Second semester!) This let me take a lot more biology and chemistry classes (something I wasn't previously skilled with, but had a deep interest in learning).

This isn't as widespread anymore since there are platforms that do everything. Even though the learning curve has dropped and the barrier to entry is lower than ever, the barriers of interest and necessity have been raised.

Kids don't need to hack around as much, and it's sad.


I think you are overestimating the total number of kids that did the things you did.

A higher percentage of kids that were on the internet had the skills you are talking about, but they weren't a higher percentage of the overall population.

For example, 5% of all young teens might have been doing what you were doing, and since only around 10% of that age group was online, they made up half of the internet users. Today it's still 5% of the total population, but with roughly 90% of that age group online, that's only 1/18th of the internet population of that age range.


Seriously. I think their comment is a good example of patting themselves on the back for their interests while being completely wrong about reality.

I happened to run my own phpbb/vbulletin forums as a kid and I can confidently say I was the only kid even close to doing that in my high school. I also didn't need to know anything to do it except press buttons on CPanel and phpmyadmin, things I didn't really understand. And I was scared to tell other guys at school about this stuff because I'd be an instant loser. Even in uni I rarely met anyone with those interests.

Meanwhile, these days, it's completely cool to spin up your own Discord server for your friends. It's totally mainstream to be into that sort of thing. Even gaming is cool now. And there's more kids these days doing more of what the grandparent post glamorizes because it's more accessible than ever.

I'd be surprised if grandparent commenter was even working off any real experience with modern kids and instead just wanted to glamorize their own nostalgia. I mean, surely modern kids aren't as cool as he was.


> surely modern kids aren't as cool as he was.

Surely.

I don't get why you folks have to be so rude. It diminishes your argument. I was with you until you had to poke fun at me. I'm a human being capable of making mistakes and reevaluating my ground truths. Why be like that?

> patting themselves on the back

While there's a lot of nostalgia in my anecdote, it's mostly spurred on by my dislike of platformization and centralization. I think they produce many negative externalities.

But thanks, now I'll clam up and stop talking about my personal experience.


I think the more modern equivalent would be making games with various JavaScript game engines (Twine, etc.) in a web browser or similar, and making art with MediBang Paint, and so on.

Then there is the whole Linus Tech Tips "build a cool RGB flashy gamer computer and poke at shit" scene, where those kinds of tech YouTubers get a million views per video.

Or buying a Raspberry Pi at 13 and poking at Linux on the internet.

You have to realize that, even back then, only 0.5% of a school's population was going to be into computers enough to install Linux on anything. At my small high school, I was basically the only one.


I initially sympathized with this line of thinking, but someone recently convinced me otherwise. The logic is as follows:

* Do programmers still fool around with COBOL or assembly or Fortran? Not so much

* How important is C nowadays? Yes - everything runs on it - but that is because for the most part it is rock solid, and we need fewer and fewer C developers

* The set of "primitives" required has moved up a layer of abstraction. New skills include: website builders, data pipelines, infra-as-code, cloud infra, web frameworks, etc.

If people are still hacking the same way as they were 20 and 30 years ago, that is a problem. Yes, the hacking today surely looks different than it did decades ago - the primitives are larger and higher leverage - but there is hacking all the same.


It's also assumed that anyone not using the "low barrier to entry" platforms is a professional (or else, trying to make a professional product), which, ironically, raises the barrier to entry for anyone trying to get out of the sandbox.


> If you wanted a video game forum, you needed to host it.

InvisionFree started in 2002 and was incredibly, bonkers-popular.


That's a great point and something I've never thought about.

I think it's something that would be helpful to think about when I'm raising my kids.


I'm under 35 and consider myself very 'digitally native'. It wasn't until I helped my 19-year-old niece set up her laptop that I realized exactly the same thing. They're fine with YouTube and Spotify on a tablet or phone. She could pick things up fairly quickly for school assignments (Office, etc.). I think it was her hunt-and-peck typing that made me realize my assumptions were off. (I don't think typing classes in middle and high school are much of a thing anymore in my region.)


Watching professionals hunt-and-peck all day drives me mad.

I took a 2 week typing class in the summer after 8th grade. We learned on big heavy mechanical typewriters where you really had to hammer the key to get an impression. It was time well spent, as it has paid off for me continuously ever since. For example, I wrote this entirely with touch typing (not looking at the keyboard).


I have occasionally thought about taking the time to learn proper touch typing. But I never seem to get over the idea that my bottleneck is not in the fingers, but in the brain trying to figure out what to type in the first place... Can someone prove me wrong and motivate me to learn?


It's the same as any other higher-level skill: the faster you can perform the lower-level functions, the faster you can iterate.

Your medium of expression also influences the speed of your thought. You'll notice that people who speak for a living seem wittier than those who don't; they're able to form thoughts and translate them into expression very quickly. Typing/writing is already much slower than speaking, but its advantage is the ability to revise a thought before conveying it. If you're typing slowly, you're probably forming thoughts more slowly than someone who types quickly. Anecdotally, spending so much of my time communicating via keyboard stunted my ability to express myself vocally, because it lowered the rate at which I had to generate, internally revise, and express thoughts. It would be even worse if I could not touch type.

That's my layman's take, anyway.


I'll assume you're a programmer. Have you ever been in a situation where your IDE, text editor, or other tools were extremely slow and unresponsive? You're trying to perform some task and there's this external bottleneck slowing down everything you are doing. Instead of completing the subtask quickly so that your train of thought flows smoothly into the next unit, the flow is interrupted by having to wait. And once you're out of the flow, there's mental effort required to get back into it. But if your tools present a constant bottleneck, then you are permanently stuck in this less efficient mental state. Essentially the bottlenecking tool becomes a part of your mental model, and that useless chunk of mental real-estate reduces what can be accomplished by the same mind unburdened by this useless detail.

But all of this applies just as much when a lack of typing skill is the constant bottleneck. If you can't type at the speed of thought, the quality and complexity of your thoughts suffer, since part of your working memory is consumed by the process of typing. This has real implications for what you're able to accomplish.


Touch typing really pays off when you're typing in things like notes. It literally doubles the speed, because reading the source and typing become one operation rather than two.

The touchscreens of today make touch typing impossible.

Also, since my laptop's keyboard is cramped and wretched, I use a full-size Bluetooth keyboard (they are surprisingly hard to find) to touch-type on. I hate how every keyboard maker has to "innovate" by moving all the non-QWERTY keys around randomly.


This is what infuriates me so much about the new Mac portables... they were clearly designed by hunt-and-peck peons. Software "keys" that you have to look down at to hit reliably, word suggestions (implying you should be looking down at the keyboard instead of the screen, which is flat-out wrong...), and scrolling emoji. Never mind the key failures...


I'm surprised you say this. I have no problem touch typing with one, or two thumbs on almost any phone displaying a qwerty keyboard after orienting myself to it.

I'm wondering if you have very specific muscle memory outputs for your fingers vs. an intuition of where each key is relative to each other. E.g., I'm typing this sentence with only my pointer and middle finger on my right hand. Would that cause a problem for you?


Try a different angle on it, like typing games.


I saw a lot of kids using Caps Lock to type capital letters, which is even more painful. Thankfully they quickly put their new knowledge of the Shift key to use. I like to think I changed the way they use computers, but who knows whether they actually have the motivation to learn further.


Is this in the US? I thought absolutely everyone was taking typing.


If anything, there's kind of a middle-ground age range right now where you can expect greater general understanding of computing concepts, with people both younger and older than that being more ignorant.


Yep. When I was 2, my parents got an old computer from my uncle's business that he was replacing and set it up in their living room for me. I was learning the command line as a toddler. I'm certainly no guru, but that kind of formative experience was a big part of my choosing this career path. Kids today grow up with iPads and iPhones and have no cause to go poking around and experimenting with the basic functionality of their computers.


Same here; some of my earliest memories are of an old MS-DOS Mickey Mouse game that required you to look through a booklet and enter codes to perform actions.


Learned how to spell quite young to use a CLI, first word was "DOS" lol.


I saw a report one day, where they interviewed many young people (<25) about their tech usage.

Most only used a smartphone, and many of them only knew the well-known apps: FB, IG, SC, WA, and so on.

They were "natives" in their apps, but most of them didn't know much besides that.

Which is probably what you would expect, and it's probably still better than the mindless TV junkies of before.


“Data thinking”, “data doing” and “data participation” struck me as an unintuitive way to categorise digital literacy, especially as the example for “doing” seems to fit more squarely under critical thinking: “being able to identify and highlight the source of information others share”.


The sooner we dispel the notion that all people are equally capable at everything, the sooner we can get back to prioritizing merit and see it return to the country. It will take decades. The virus has exposed the incompetence that pervades the nation.

Articles like this are only surprising because some two generations of propaganda has convinced young Americans that everyone deserves a trophy and we can all be Einsteins and Armstrongs if we just set our minds to it.

But this is more so a structural, institutional issue. This mindset has resulted in the conflation of equality of outcome with equality of opportunity, as we take for granted in the West that everyone is equally capable. So now we throw money, man-hours, and bad policy at our lowest achievers in a sort of crabs-in-a-bucket mentality, instead of prioritizing resources for those who are more likely to benefit society in broad strokes - scientists who cure disease, for example - and as a result our native students are being thoroughly outcompeted by other nations.

This has resulted in a net reduction of the standard of living for all Americans. And is probably one reason for inequality and general unhappiness.


> Articles like this are only surprising because some two generations of propaganda has convinced young Americans that everyone deserves a trophy and we can all be Einsteins and Armstrongs if we just set our minds to it.

The article is about how economic class and educational inequality prevents everyone from having meaningful technical experiences that lead to technical literacy. I think you're missing the point by a mile or so.


>The article is about how economic class and educational inequality prevents everyone from having meaningful technical experiences that lead to technical literacy

No. That's exactly what I'm arguing is wrong. We're putting the cart before the horse. At this point the internet is ubiquitous even if you're poor - it's in your schools, it's in free libraries, and entry-level smartphones are cheap enough that all but the poorest can afford them if it's a priority.

The problem is none of that. It's that not everyone is technically competent - and no amount of education or money will fix that. Intelligence is a high dimensional spectrum and technical details regarding computers are out of reach for a sizable proportion of the population. This is exacerbated by poverty and/or poor education, but those are at most half of the problem.


> We're putting the cart before the horse.

If you want to accurately act on 'inherent intelligence' you need to correct for external factors first, or else you'll just get a slice of the bell curve based on how rich/poor people are.


I'm not arguing against that. What I'm saying is that we have been drastically overcorrecting for too long because we refuse to collectively acknowledge the reality of intelligence as a distribution and allocate resources that maximally benefit society.


> drastically overcorrecting

If anything, we've been undercorrecting. A simple example would be the many studies showing that IQ in children measurably increases in the years following a move from a poor area to a rich area, even if the child's family's income doesn't change.


Most schools' internet filters domain-block Wikipedia, libraries are not within walking distance for substantial numbers of people (and that's leaving out how library fees are prohibitive for low-income families), and entry-level smartphones are locked down to the point of making curiosity irrelevant.

When you look at who in this industry succeeds, you very rarely see people from low-income populations. There isn't substantial reason to believe that the distribution of opportunity isn't the handicap for a significant number of people.

YC: founded by the son of a nuclear physicist, and the son of one of the UNIX authors.

Microsoft: founded by the son of a lawyer whose firm is currently making a billion a year in revenue and was similarly doing great at the time.

Facebook: founded by the son of a shrink and a dentist operating in one of the richest areas of New York, rich enough to be sent to private school.

Apple: founded by the son of an engineer at Lockheed.

Amazon: founded by the son of an engineer at Exxon.

Netflix: founded by someone whose family was rich enough to extravagantly donate during the Great Depression.

Google: founded by the son of someone with a PhD in Computer Science (one who had multiple thousand-dollar personal computers filling the house from the time said son was born, at that), and the son of someone with a PhD in Computer Science who was a researcher at NASA.

C: written by the son of someone...who worked at Bell Labs.

Rob Pike once gave a great example of how Computer Science is nothing if not under a caste system, if accidentally (explaining why one of the early UNIX usernames was 'sjb'):

https://commandcenter.blogspot.com/2020/01/unix-quiz-answers...

> 8. Q: Adam Buchsbaum's original login was sjb. Who is sjb? A: sol & buchsbaum. Adam was one of many Unix room kids with parents who worked at the Bell Labs. Adam was unusual because his father was executive vice president but apparently didn't have enough clout to get his kid his own login.

Women used to be the majority of computer programmers. Where are they now? The strongest counterexamples to the general rule of "people with rich parents who had connections" all came significantly before computing became a prestigious field, and even they didn't get all that far. (I don't see McCarthy's contributions all around the field, despite them being, for the most part, significantly better than the alternatives offered by people like Backus and Ritchie. What's the difference between the one and the other two? McCarthy was the son of communist immigrants, Backus was the son of a stockbroker, and Ritchie was the son of a Bell Labs employee.)

If you look at the history of this field, you see nothing but people with all of the opportunities in the world getting all of the success from it. People from affluent families becoming more affluent through upbringings that involved science heavily. "Meritocracy" isn't "The best of the affluent children win," it's "The best win." Computer Science does not resemble meritocracy in any fashion, and largely never has.

The only general counterexample out of all of the FAANG founders, Jobs (the son of lower-middle-class parents with no education to speak of), has been mocked for years for being comparatively non-technical. Notably, he strongly believed that the problem was an access problem, not some magic hand-waving "some people are better than others!" nonsense. His vision of what libraries should be like today was far different from yours[1], but because your mindset permeates among the people who make these decisions, it was more or less wasted.

[1] See interview starting at 0:34 in this video: https://www.youtube.com/watch?v=EA1nQQGw2O4


> and that's leaving out how library fees are prohibitive for low-income families

Setting aside your other points - completely free with no fees is prohibitive?


That's not the case in all areas. The closest library to the town I grew up in charged fees for a card (required both for checking out books and for computer access), as an example.

Further, there are fees almost everywhere else, too: overdue fees. Unsurprisingly, when you get rid of overdue fees, participation rises:

> Since the fine-free policies went into effect, the library has seen an increase of 29,094 patrons year over year. In addition, 3,900 of the 6,500 patrons who were prevented from borrowing items due to overdue fines have returned to use library resources.

https://www.kitsapsun.com/story/news/2019/07/24/kitsap-regio...

> The changes were enacted after a city study revealed that nearly half of the library's patrons whose accounts were blocked as a result of late fees lived in two of the city's poorest neighborhoods. "I never realized it impacted them to that extent," said Misty Jones, the city's library director.

https://www.npr.org/2019/11/30/781374759/we-wanted-our-patro...

It's not controversial that overdue fees are a tax on the poor:

https://www.cde.state.co.us/cdelib/removingbarrierstoaccess

Even the ALA agrees that libraries are discriminating against the poor with them:

http://www.ala.org/aboutala/sites/ala.org.aboutala/files/con...



