More free time through computers?
A good colleague of many years showed me this drawing the other day, which he had used in the 1980s to advertise his very successful computer distribution business. The caricature has, of course, the charm of the 80s and looks quite old-fashioned by now. Its value proposition, however, immediately brought current discussions to mind.
- The computer conquers our leisure time: While in the 80s the computer was seen at most as an aid in everyday business, and probably only computer nerds saw more in it (gaming and programming on the C64 and Atari ST were my topics), today people can't imagine their leisure time without smartphones, tablets or PCs. It is no longer "more leisure time through computers" but "more computers in leisure time". Whereas the computer used to be our "white knight" in the daily battle with files in the office, today we have to be careful that it doesn't get us in its grip as a "data octopus" in our free time.
- Digitalisation makes human presence superfluous: This thesis can be read either as a promise of salvation (none of us has to work so much any more, yet we still earn a good salary) or as a threat (the computer makes us all unemployed and humans superfluous). I consider both views, which are unfortunately often presented in a simplistic way in the media, to be naïve and inaccurate. Even if we use avant-garde terms like artificial intelligence, machine learning and IoT (the Internet of Things) today, these are and remain technologies used by people for people. Of course, our activity profiles will change in the future, as they have in the past. Completely new jobs, unimaginable today, will emerge and demand new skills from us. But we are still needed as workers and consumers in our societies; the change will simply happen faster and more drastically.
In any case, the cartoon hints at how much has happened in the world of information technology since then. Roughly as Moore's Law predicted, the number of transistors in a microprocessor has grown from around 100,000 in 1980 to over 12,000,000,000 in current Intel CPUs. It is hard to imagine what will be available to us in another 40 years.
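As a small back-of-the-envelope check, the two transistor counts above imply a doubling period close to Moore's famous "every two years". A minimal sketch (the 2020 end year is my assumption, since the text only says "current" Intel CPUs):

```python
import math

# Two data points from the text: ~100,000 transistors in 1980,
# ~12,000,000,000 in a current Intel CPU (end year assumed to be 2020).
t0, n0 = 1980, 100_000
t1, n1 = 2020, 12_000_000_000

doublings = math.log2(n1 / n0)              # how many times the count doubled
years_per_doubling = (t1 - t0) / doublings  # implied doubling period

print(f"{doublings:.1f} doublings in {t1 - t0} years")
print(f"one doubling every {years_per_doubling:.1f} years")
```

This works out to roughly one doubling every 2.4 years over those four decades, remarkably close to the canonical two-year rhythm.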