The Future of Cyber Security, Data and AI: Big Brother or a Brave New World?

Book tickets: https://www.sciencefestival.co.uk/event-details/the-future-of-cyber-security-data-and-ai

Do you remember when there were no computer science departments in universities, and we basically had computer support units? The people who worked there often wore white coats, as they were seen as scientists in their secretive labs, and a 10MB disk unit looked more like a washing machine.

But that all changed in the 1960s, when computer units jumped out of their support functions, and Computer Science was adopted as a proper subject area.

The University of Edinburgh was one of the first in the UK to make that jump, and has since played a world-leading role in the development of computer science.

So who was Scotland’s first professor of Computer Science?

The great Sidney Michaelson.

As part of the Edinburgh International Science Festival 2019, I am so proud to be presenting the BCS Edinburgh Branch Sidney Michaelson Memorial Lecture.

Here are the outline details:

The Future of Cyber Security, Data and AI: Big Brother or a Brave New World?

Abstract: Our world is changing fast, and we are moving into an information age. Unfortunately, many of our systems cannot be properly trusted, and thus our new world brings increased threats against our privacy. Along with this, we see the rapid rise of AI and the Internet of Things. So will these integrate seamlessly into our lives and improve things, or will they become part of a massive spying network? This presentation, full of demonstrations and audience interaction, will outline the risks we face, and also highlight the new opportunities that this information age will bring. It will also outline the mechanisms, such as blockchain and cryptography, which we will need to see us into this new age.

Sidney and his legacy

Sidney came to Edinburgh from London at the start of the 1960s, at a time when computers cost many millions of pounds to buy and had a massive memory capacity of 1MB (yes … one megabyte). He grew up in the East End of London and graduated from Imperial College. He left London and came to Edinburgh just as the Computer Unit split into a Department of Computing and the Computer Support Unit.

In 1966, he became the first Professor of Computing in the University of Edinburgh, and also led the new Computing Department. Overall, Sidney probably did more to promote Computing as a discipline than almost anyone else, including working extensively with the BCS. He died on 21 February 1991 at the age of 65.

At the time, running multiple programs simultaneously was a major challenge, and one which required a great deal of research. Along with this, computers were maintained by technicians in white coats, who were the only ones trusted to load the FORTRAN punch cards into the mainframes. Sidney broke new ground in establishing academic work within computer science.

A Bit of History

One of the first occurrences of computer technology was in the USA in the 1880s. It came about because the American Constitution demands that a census be undertaken every 10 years. As the population of the USA increased, it took an increasing amount of time to produce the statistics, and, by the 1880s, it looked likely that the 1880 census would not be complete until 1890. To overcome this, Herman Hollerith (who worked for the Government) devised a machine which accepted punch cards with information on them. These cards allowed a current to pass wherever a hole was present.
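The sensing principle, where a hole completes an electrical circuit and lets current flow, is essentially one bit of information per card position. A minimal sketch of the idea in Python (the 12-row layout matches the classic card format, but the function and variable names are illustrative, not Hollerith's actual encoding):

```python
# Each card column is modelled as the set of row positions that have
# been punched. A hole lets current flow, which the reader senses as 1.

def read_column(punched_rows, num_rows=12):
    """Return the bit pattern sensed for one card column.

    punched_rows: row positions where a hole has been punched.
    A standard card had 12 row positions per column.
    """
    return [1 if row in punched_rows else 0 for row in range(num_rows)]

# Example: a column punched at rows 0 and 3
print(read_column({0, 3}))  # [1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
```

The tabulating machine then simply counted how often given patterns occurred, which is why it suited census statistics so well.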

Hollerith’s electromechanical machine was extremely successful and was used in the 1890 and 1900 censuses. He even founded the company that would later become International Business Machines (IBM): the Computing-Tabulating-Recording Company (CTR). Unfortunately, Hollerith’s business fell into financial difficulties and was saved by a young salesman at CTR, named Tom Watson, who recognized the potential of selling punch-card-based calculating machines to American business. Watson eventually took over the company and, in the 1920s, renamed it International Business Machines Corporation (IBM). After this, electromechanical machines were sped up and improved, and would soon lead to electronic computers, using valves.

Figure: Punch cards

After the creation of ENIAC, progress in the computer industry was fast. By 1948, small electronic computers were being produced in quantity, and within five years around 2,000 were in use; by 1961 it was 10,000, and by 1970, 100,000. IBM, at the time, had a considerable share of the computer market, so much so that a complaint was filed against them alleging monopolistic practices in their computer business, in violation of the Sherman Act. In January 1956, the US District Court made a final judgment on the complaint against IBM, and a ‘consent decree’ was signed by IBM, which placed limitations on how IBM conducted business with respect to ‘electronic data processing machines’.

Figure: ENIAC

In 1954, the IBM 650 was built and was considered the workhorse of the industry at the time (it sold about 1,000 machines, and used valves). In November 1956, IBM showed how innovative they were by developing the first hard disk, the RAMAC 305. It was enormous by today’s standards, with 50 two-foot-diameter platters giving a total capacity of 5MB. Around the same time, the Massachusetts Institute of Technology produced the first transistorised computer: the TX-0 (Transistorized Experimental computer). Seeing the potential of the transistor, IBM quickly switched from valves to transistors and, in 1959, produced the first commercial transistorised computer, the IBM 7090/7094 series, which dominated the computer market for years.

Figure: RAMAC

In 1960, in New York, IBM went on to develop the first automatic mass-production facility for transistors. In 1963, the Digital Equipment Corporation (DEC) sold its first minicomputer, to Atomic Energy of Canada. DEC would become the main competitor to IBM, but would eventually fail as it dismissed the growth of the personal computer market.

Figure: DEC VAX 11

The second generation of computers started in 1961, when the great innovator Fairchild Semiconductor released the first commercial integrated circuit. Over the next two years, significant advances were made in the interfaces to computer systems. The first was by Teletype, who produced the Model 33 keyboard and punched-tape terminal; it was a classic design, and appeared on many of the available systems. The other advance came from Douglas Engelbart, who developed the mouse pointing device for computers. The production of transistors increased, and each year brought a significant decrease in their size.

Figure: The first mouse

The third generation of computers started in 1965, with the use of integrated circuits rather than discrete transistors. IBM was again innovative, and created the System/360 mainframe, a true classic of computing history. Then, in 1970, IBM introduced the System/370, which included semiconductor memories. These machines were the great computing workhorses of the time, but were extremely expensive (around $1,000,000) to purchase and maintain; most companies had to lease their computer systems, as they could not afford to buy them.

Figure: IBM System/360

As IBM happily clung to its mainframe market, several new companies were working away to erode its share. DEC would be the first, with its minicomputer, but it would be the PC companies of the future who would finally overtake IBM. The beginning of this loss of market share can be traced to the development of the microprocessor, and to one company: Intel. In 1967, though, IBM again showed its leadership in the computer industry by developing the first floppy disk. The growing electronics industry was also enticing new companies to specialize in key areas, such as International Research, who applied for a patent for a method of constructing double-sided magnetic tape utilizing a Mumetal foil interlayer.

The beginning of the slide for IBM occurred in 1968, when Robert Noyce and Gordon Moore left Fairchild Semiconductor and joined up with Andy Grove to found Intel Corporation. To raise the required finance, they went to a venture capitalist named Arthur Rock, who quickly found the required start-up funding, as Robert Noyce was well known as the person who first put more than one transistor on a piece of silicon. At the same time, IBM scientist John Cocke and others completed a prototype scientific computer called the ACS, which used some RISC (Reduced Instruction Set Computer) concepts. Unfortunately, the project was cancelled because it was not compatible with IBM’s System/360 computers.

Figure: First integrated circuit

In 1969, Hewlett-Packard branched into the world of digital electronics with the world’s first desktop scientific calculator: the HP 9100A. At the time, the electronics industry was producing cheap pocket calculators, which paved the way for affordable computers. The Japanese company Busicom commissioned Intel to produce a set of between eight and 12 ICs for a calculator. Instead of designing a complete set of ICs, Ted Hoff, at Intel, designed a single integrated circuit that could receive instructions and perform simple functions on data. The design became the 4004 microprocessor.

Intel thus produced a set of ICs which could be programmed to perform different tasks. These were the first ever microprocessors, and soon Intel (short for Integrated Electronics) produced a general-purpose 4-bit microprocessor, named the 4004. In April 1970, Wayne Pickette proposed to Intel that they use the computer-on-a-chip for the Busicom project. Then, in December, Gilbert Hyatt filed a patent application entitled ‘Single Chip Integrated Circuit Computer Architecture’, the first basic patent on the microprocessor.

Figure: 4004 chip

The 4004 caused a revolution in the electronics industry, as previous electronic systems had a fixed functionality; with this processor, the functionality could be programmed in software. Amazingly, by today’s standards, it could only handle four bits of data at a time (a nibble), contained 2,000 transistors, had 46 instructions, and allowed 4KB of program code and 1KB of data. From this humble start, the PC has since evolved using Intel microprocessors. Intel had previously been an innovative company, and had produced the first memory device (static RAM, which uses six transistors for each bit stored in memory), the first DRAM (dynamic memory, which uses only one transistor for each bit stored in memory) and the first EPROM (which allows data to be downloaded to a device, and then permanently stored).

In the same year, Intel announced the 1KB RAM chip, which was a significant increase over previously produced memory chips. Around the same time, one of Intel’s major partners (and, as history has shown, competitors), Advanced Micro Devices (AMD) Incorporated, was founded. It was started when Jerry Sanders and seven others left (yes, you’ve guessed it) Fairchild Semiconductor. The incubator of the electronics industry was producing many spin-off companies.

At the same time, the Xerox Corporation gathered a team at the Palo Alto Research Center (PARC) and gave them the objective of creating ‘the architecture of information’. This would lead to many of the great developments of computing, including personal distributed computing, graphical user interfaces, the first commercial mouse, bit-mapped displays, Ethernet, client/server architecture, object-oriented programming, laser printing and many of the basic protocols of the Internet. Few research centers have ever been as creative and forward-thinking as PARC was over those years.

In 1971, Gary Boone, of Texas Instruments, filed a patent application relating to a single-chip computer, and the microprocessor was released in November. Also in the same year, Intel delivered the 4004 microprocessor to Busicom. Then, in 1972, Intel again showed itself to be a truly innovative company, becoming the first to develop an 8-bit microprocessor: the 8008. Excited by the new 8-bit microprocessors, two kids from a private high school, Bill Gates and Paul Allen, rushed out to buy the new 8008 device. This, they believed, would be the beginning of the end of the large, and expensive, mainframes (such as the IBM range) and minicomputers (such as the DEC PDP range). They bought the processor for the high price of $360 (possibly a joke at the expense of the IBM System/360 mainframe), but even they could not make it support BASIC programming. Instead, they formed the Traf-O-Data company and used the 8008 to analyse ticker-tape read-outs of cars passing in a street. The company would close down in the following year (1973) after it had made $20,000, but from this enterprising start one of the leading computer companies in the world would grow: Microsoft (although it would initially be called Micro-soft).

Figure: Traf-o-Data

At the end of the 1970s, IBM’s virtual monopoly on computer systems started to erode: from the high-powered end as DEC developed their range of minicomputers, and from the low-powered end by companies developing computers based around the newly available 8-bit microprocessors, such as the 6502 and the Z80. IBM’s main contenders, other than DEC, were Apple and Commodore, who introduced a new type of computer: the personal computer (PC). The leading systems, at the time, were the Apple I and the Commodore PET.

Figure: Apple 1 computer

These captured the interest of the home user and, for the first time, individuals had access to cheap computing power. These flagship computers spawned many others, such as the Sinclair ZX80/ZX81, the BBC microcomputer, the Sinclair Spectrum, the Commodore Vic-20 and the classic Apple II (all of which were based on the 6502 or Z80). Most of these computers were aimed at the lower end of the market and were mainly used for playing games, not for business applications. IBM finally decided, with the advice of Bill Gates, to use the 8088 for its version of the PC, and not, as they had first thought, the 8080 device. Microsoft also persuaded IBM to introduce the IBM PC with a minimum of 64KB of RAM, instead of the 16KB that IBM had planned.

In 1973, the model for future computer systems appeared at Xerox’s PARC, when the Alto workstation was demonstrated with a bit-mapped screen (showing the Cookie Monster, from Sesame Street). The following year, at Xerox, Bob Metcalfe demonstrated the Ethernet networking technology, which was destined to become the standard local area networking technique. It was far from perfect, as computers contended with each other for access to the network, but it was cheap and simple, and it worked relatively well.
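That contention scheme, in which stations wait a random time after a collision before retrying, survives today as the binary exponential backoff of CSMA/CD Ethernet. A simplified sketch in Python (the function name is illustrative; the slot time and limits follow the classic 10 Mbit/s Ethernet rules):

```python
import random

SLOT_TIME_US = 51.2  # classic 10 Mbit/s Ethernet slot time, in microseconds

def backoff_delay(attempt):
    """Pick a random wait after the given collision attempt (1-based).

    After the n-th collision, a station waits k slot times, with k chosen
    uniformly from 0 .. 2^min(n, 10) - 1, giving up after 16 attempts.
    """
    if attempt > 16:
        raise RuntimeError("too many collisions: frame dropped")
    k = random.randrange(2 ** min(attempt, 10))
    return k * SLOT_TIME_US

# After a first collision, a station waits either 0 or 1 slot times.
print(backoff_delay(1) in (0.0, SLOT_TIME_US))  # True
```

The randomness is the whole trick: two colliding stations are unlikely to pick the same delay twice in a row, so repeated collisions quickly resolve without any central coordination.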

IBM was also innovating at the time, creating a cheap floppy disk drive. They also produced the IBM 3340 hard disk unit (a Winchester disk), whose recording head sat on a cushion of air, 18 millionths of an inch above the platter. The disk had four platters, each 8 inches in diameter, giving a total capacity of 70MB.

The days of IBM leading the field quickly became numbered when Compaq managed to reverse-engineer the BIOS: the software which allows the operating system to talk to the hardware. Once Compaq did this, IBM struggled to set standards in the industry, making several attempts to define new operating systems, such as OS/2, and new computer architectures, such as the MCA bus standard. The industry decided that common standards were more important than ones defined by a single company.

There’s a great demo of the development of computer systems from 1980s to now here.