
President Column

Column by the President of Hitachi Research Institute, Mizoguchi

#12: Digital Technology and Programs

I first came to know Microsoft's name about 40 years ago. When I turned on a Japanese tabletop microcomputer in my school's physics laboratory, the message "Copyright 1979 (c) by Microsoft" appeared at the top of the screen. At the time, Microsoft supplied software for BASIC, the programming language used on microcomputers. The Apple II, the Apple microcomputer I dreamed of owning, also came with BASIC installed. Had I turned one on, Microsoft's name would probably have appeared on its screen in the same way. But the Apple II was so expensive that all I could do was gaze at it through a shop window in Akihabara Electric Town.

Strictly speaking, what Microsoft provided was an interpreter: software that translated a program written in BASIC into machine language that a microcomputer's central processing unit (CPU) could execute. Since I could not afford even a domestic microcomputer, I joined the physics club. After school I would use the computer in the physics laboratory, writing programs in BASIC to run on it. This brought me a great deal of happiness.

However, on a microcomputer driven by an 8-bit CPU, BASIC could handle only drawings and games made of simple numbers and symbols, and it was difficult to run proper game software such as Space Invaders. Because the program was translated as it ran, the graphics it produced moved slowly, flickering on the screen as if floating. I therefore tried my best to write a program in machine language instead, but all that happened was that the microcomputer ran out of control. I was stunned.
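As a rough illustration of that slowness, the sketch below, written in Python rather than any real Microsoft BASIC (the tiny LET/PRINT/IF language and the run function are invented for this example), interprets a program one line at a time. Every statement is re-parsed each time control reaches it, so a loop pays the translation cost on every pass; on a late-1970s 8-bit CPU that overhead applied to every frame of a game's animation, which is why hand-written machine code was so much faster.

```python
# A minimal line-by-line interpreter sketch (illustrative only, not real BASIC).
# Note how each statement is parsed again every time control reaches it.

def run(program):
    variables = {}          # variable store, e.g. {"i": 3}
    pc = 0                  # index of the statement currently being executed
    while pc < len(program):
        line = program[pc].strip()
        keyword, _, rest = line.partition(" ")        # parsing happens at run time
        if keyword == "LET":                          # LET <name> = <expression>
            name, _, expr = rest.partition("=")
            variables[name.strip()] = eval(expr, {}, variables)
            pc += 1
        elif keyword == "PRINT":                      # PRINT <expression>
            print(eval(rest, {}, variables))
            pc += 1
        elif keyword == "IF":                         # IF <condition> GOTO <index>
            cond, _, target = rest.partition(" GOTO ")
            pc = int(target) if eval(cond, {}, variables) else pc + 1
        else:
            raise ValueError(f"unknown statement: {line!r}")

run([
    "LET i = 0",         # 0
    "LET i = i + 1",     # 1
    "PRINT i",           # 2
    "IF i < 3 GOTO 1",   # 3: loop back until i reaches 3
])
```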

When I entered university, microcomputers had been renamed personal computers and ran on 16-bit CPUs. When I turned on a computer in the computer room to analyze macroeconomic data for my graduation thesis, the name Microsoft appeared on the screen again. By then, Microsoft was offering an operating system for personal computers, the Microsoft Disk Operating System (MS-DOS).

As a working adult, I bought the Apple PowerBook I had long yearned for. It had a 32-bit CPU. I took to bringing my PowerBook, as a personal device, to a workplace that had only one PC per section (now that I think about it, that was a peaceful era). Computers with a graphical user interface came with convenient application software for creating documents and tallying numbers, but that software, too, was supplied by Microsoft.

After that, with the advent of the Internet, computing shifted from centralized processing to distributed client-server systems, there was much buzz about the word “multimedia,” and mobile phones became popular. In the 2000s, as fixed and wireless communications infrastructure became faster, an environment emerged in which network services could be used anywhere.

Now, as I work from home, I have two smartphones, one tablet, and two laptops. It is unclear whether the programs connected to the network and running on their screens are executing on the device itself or somewhere in the cloud across the network. Every day, as emails and messages pop up and I roam the network to gather information, I am sometimes flabbergasted by the recommendations that appear on the screen.

In the old days, you always knew where a stand-alone computer was, whether it was at hand or at the far end of a wire. Programs were closed off from the world inside such computers. Nowadays, computer programs, network services, and information are all mixed together, and it is not clear how much remains closed off within our own worlds. Moreover, some programs run behind services, and it is becoming difficult to know who made them. If a program runs out of control, you do not know where the reset button is.

How should we ensure the trustworthiness of digital system services? As the digitalization of work and life accelerates, we are, without realizing it, becoming ever more dependent on programs. Countries and regions are increasingly discussing rules for using data and for developing AI and programs. At the very least, there is no doubt that demand for literacy (the knowledge to understand and use these technologies properly) will keep growing, among both those who make programs and those who use them.