Back in the day, a day of which I have no personal experience, when personal computers were first entering people's homes and businesses, things were more primitive. Computers were both more and less general-purpose. What do I mean by this? Lower-level programs, and the ability to write programs, were easier to access, yet the software written for users was more specialised. You still get this in smaller applications you may come across on the internet; they serve one specific purpose - but more popular software from companies like Microsoft and Adobe has become more general-purpose; PowerPoint, for example, is Turing-complete.
What does this mean for users? It is my view that these large software packages with many features have scared users, so much so that they do not want to learn to use the software to its full potential. They think "software is just like that" - it's a black box, impenetrable. This discourages them from exploring it. Users are afeared to make mistakes, lest they lose their data. It is a damn shame.
This only seems to affect those of a certain age group, which I find curious: those who grew up shortly after these computers became ubiquitous. Younger folk are not afraid to make such mistakes; they know that things can be undone, and they know to make backups when necessary (sometimes having learned from past mistakes). Older, tech-savvy folk know that it is all just signals and transistors; they have some inherent understanding of what the computer is doing.
What should we - as authors of software - do about this? Stop dumbing down our designs! There are other options, but this is my preferred one. We must demystify the clouded black box and reduce the levels of abstraction our software places between the user and the electrons - but not so much that it becomes obtuse. Okay then, how shall we do this? For one, we should be more descriptive in the names of buttons and menu options; if a description is too long, put it in a tooltip. We should give users fewer options, not more! If a given task can be accomplished in multiple ways, you might think that a good thing, but I say it is redundant. If a user performs a task often, they should be able to automate it by recording their actions and replaying them as a macro. Users should be made to think about what they are doing, so that their actions are more deliberate and meaningful. Once they are in this mindset, any piece of software should be easy for them to master - except for those vastly overcomplicated software packages.
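The "record your actions as a macro" idea above is, at heart, the command pattern: each user action is a small object that can be performed now and replayed later. Here is a minimal sketch in Python; every name in it (Document, Action, MacroRecorder, and so on) is hypothetical, invented purely for illustration, and this is one possible shape of such a recorder, not a description of any real application's API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Document:
    """A toy document: just a string of text."""
    text: str = ""

@dataclass
class Action:
    """One user action: a name plus a function that applies it to a document."""
    name: str
    run: Callable[[Document], None]

class MacroRecorder:
    """Performs actions immediately and, while recording, remembers them for replay."""

    def __init__(self) -> None:
        self.recording = False
        self.actions: List[Action] = []

    def start(self) -> None:
        self.recording = True
        self.actions = []

    def stop(self) -> None:
        self.recording = False

    def perform(self, doc: Document, action: Action) -> None:
        action.run(doc)                  # do it now
        if self.recording:
            self.actions.append(action)  # and remember it for later

    def replay(self, doc: Document) -> None:
        for action in self.actions:      # repeat the recorded actions in order
            action.run(doc)

# Usage: record two typing actions, then replay the macro on a fresh document.
recorder = MacroRecorder()
doc = Document()
recorder.start()
recorder.perform(doc, Action("type hello", lambda d: setattr(d, "text", d.text + "hello ")))
recorder.perform(doc, Action("type world", lambda d: setattr(d, "text", d.text + "world")))
recorder.stop()

fresh = Document()
recorder.replay(fresh)
print(fresh.text)  # → hello world
```

A nice property of this design is that the same Action objects can also feed an undo stack, which addresses the earlier point: users stop fearing mistakes once they know things can be undone.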
For the most part, users of MS Word could achieve their goals in WordPad, and some could even achieve them in Notepad. Nowadays Word is the default, and its user base is shifting as the people who grew up with it age. I worry that the coming generations may not be able to use a computer as well as the people of today, as software is continually dumbed down. A part of me thinks this could be for the best if it makes computing more accessible, but we are training people to expect the computer to do everything for them, and to do it correctly. Most businesses could make do with a simple spreadsheet and a mechanism to back it up, instead of a database.
...
You might say you can't teach an old dog new tricks, but I really do believe the old design of Word was better - and this goes for all Microsoft software: menus were better than the "ribbons" we have today. What even is a ribbon? It's something you use for decoration! What's a menu? Something you choose an item from! I despair!
I would like to leave you with this video from Tomorrow's World: www.bbc.co.uk/archive/tomorrows-world--home-computer-terminal/zkknqp3
It is not the one I remember, where an elderly woman used a computer to keep track of her sales, but it demonstrates my point well nevertheless.