Honestly I think it just means that they're smart. My girlfriend in college didn't remotely fit any tech stereotypes, but when she saw how much more functional my Linux laptop was than everyone else's, she spent a day and set up Linux for herself. We're still friends, and ten years later she's still happily using desktop Linux and grumbles to me on the rare occasion she needs to run a Windows VM for some specialty software.
The idea that being tech savvy requires lots of hobbyist time investment and a deep interest in the topic is insane, like saying that being a good driver requires being independently interested in cars as a hobby.
To a large degree, I think that Apple's marketing, and the rest of the industry's following suit (cf. Google's shift in product design and focus six years ago), shoulders a big chunk of the blame. Endemic short-term thinking, in the form of only considering how easy something is to use in the first ten seconds, has led to an entire generation of computer users quarantined into a tiny fraction of what their devices can do and how well they can fit their needs. (For example, my dad is _also_ on Linux, and he has barely managed to figure out copy and paste. Windows is simply too buggy and difficult an OS for users as novice as he is.)
You're still describing just a portion of the population, at best. Yes, your point about the myths is fair. But even with those myths dispelled, there's going to be at least XX% of the population that struggles with that work or is unwilling to put it in.
Think about other super important areas of life like finances and health. Perfectly capable, developed, functional adults still routinely neglect these things because they don't want to 'deal with' the mental overhead of 'figuring it out'.
As long as this population remains, there will be a market for solutions that offload some of the 'work'.
> Think about other super important areas of life like finances and health. Perfectly capable, developed, functional adults still routinely neglect these things because they don't want to 'deal with' the mental overhead of 'figuring it out'.
Not sure these are great examples. There is a LOT of good self-learning information out there about computers and tech. Someone could know nothing about Linux, study online references, experiment a little, and become competent. And you can always find a Linux nerd to give you advice and help. Contrast:
(Personal) Finance:
Most investing information on the net will not produce any kind of outsized returns, and quite a bit of it is scams and plain bad advice. You're not going to become rich by following r/PersonalFinance. And unlike tech, where a bad experiment means you need to wipe the hard drive and try again, a bad investing experiment can lose you money. And all those financial advisors out there? LOL, they're crooked as a bag of snakes and are primarily out to milk you.
Health and fitness:
Health is even more of a shit show. Good luck finding any consistent, reliable information through self-study. This is good for you. Now it's bad for you. This causes cancer. Now we know it doesn't. Now it does again. Every month there's a new diet fad. Online, quackery is the rule rather than the exception. The top 100 search results are going to be weight-loss spam. And your doctor? LOL, he just wants to charge you for repeated visits and sell you whatever pill he's being incentivized to push this month.
I don't think it's about people not wanting to deal with the mental overhead of figuring it out. It's genuinely hard to sort good information from bad.
Honestly I just think it means that this stuff really isn't all that difficult. I mean, sure, the first person who figured it out and posted a "how to" blog article is usually pretty smart (or at least a domain expert), but it really doesn't take much intelligence to follow the steps in an article posted online.
I think many of us overestimate our own intelligence just because we have worked with computers our whole lives. The rest of the world caught up with us circa 2010, and technical proficiency (and even basic development skills) is now table stakes. Hell, they teach machine learning in high school these days.
Everyone is tech savvy these days. It's no longer a differentiator unless you gain some specialized knowledge and keep it current.
> Honestly I just think it means that this stuff really isn't all that difficult.
I mean, I agree. I don't personally think this stuff is difficult; "smart" in this case stands in opposition to either "too irrational to spend the time on high-ROI computer literacy" or "unintelligent enough that the ROI _isn't_ worth it". I really don't think the latter is true of most people, but that's the excuse people often give for making intentionally uninformed personal computing decisions.
> I think many of us overestimate our own intelligence just because we have worked with computers our whole lives.
Yea, this is definitely not what I'm saying. Frankly, I don't think you have to be all that smart to be an engineer, let alone computer literate.