Digital circuit implementation of computers

The physical realization of these conceptual designs has varied. As mentioned earlier, a stored-program computer can be mechanical or based on digital electronics. A digital circuit carries out arithmetic and logic on binary numbers by means of electrically controlled switches such as relays. Shannon's paper showed how relays could be arranged to form logic gates that realize simple Boolean operations. Other researchers soon pointed out that vacuum tubes could replace relays. Vacuum tubes were originally used as amplifiers in radio circuits, but they increasingly came to be used as fast switches in digital circuits: when the control pin of a tube is energized, current can flow freely between the other two terminals.
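
As a rough illustration of Shannon's observation, the behaviour of a switch-based gate can be modelled as a Boolean function, and larger gates can be composed from simpler ones. The following is a minimal sketch in Python, not tied to any particular hardware:

```python
# Minimal sketch: modelling switch-based logic gates as Boolean functions.
# Each "gate" mirrors how a relay or vacuum-tube switch either passes or
# blocks current depending on its control input.

def NOT(a: bool) -> bool:
    return not a

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def NAND(a: bool, b: bool) -> bool:
    # A composite gate built from simpler ones, just as relay networks
    # were combined in Shannon's analysis.
    return NOT(AND(a, b))

# Print the truth table of the composed NAND gate.
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), int(NAND(a, b)))
```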

Through combinations of logic gates, we can design devices that carry out far more complex tasks. An adder is one example: it adds two numbers electronically and stores the result. In computer science, such a method of achieving a specific goal through a sequence of operations is called an algorithm. Eventually, people succeeded in assembling a complete ALU and controller out of a considerable number of logic gates. Just how considerable can be seen from CSIRAC, perhaps the smallest practical vacuum-tube computer: the machine contained 2,000 tubes, many of them dual-purpose devices, which means it held somewhere between 2,000 and 4,000 logic elements.
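
To make the idea of an adder concrete, here is a brief, purely illustrative sketch in Python of a one-bit full adder expressed with Boolean operations and chained into a ripple-carry adder; real machines realize the same logic in gates rather than in software:

```python
# Illustrative sketch: a 1-bit full adder built from Boolean logic,
# then chained bit by bit into a ripple-carry adder.

def full_adder(a: int, b: int, carry_in: int):
    # The sum bit is the XOR of the three inputs.
    s = a ^ b ^ carry_in
    # A carry is produced when at least two of the inputs are 1.
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_carry_add(x: int, y: int, width: int = 8):
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

print(ripple_carry_add(23, 42))   # (65, 0) for an 8-bit adder
```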

Vacuum tubes were clearly unsuited to building gate circuits at large scale: they were expensive, unreliable (especially in large numbers), bulky, power-hungry, and not fast enough, even though they far outperformed mechanical switching circuits. All of this led to their replacement by transistors in the 1960s; transistors are smaller, easier to work with, more reliable, more energy-efficient, and cheaper.

Integrated circuits are the foundation of today's electronic computers. After the 1960s, transistors began to be gradually replaced by integrated circuits, which place a large number of transistors, other electronic components, and their interconnections on a single piece of silicon. In the 1970s, the ALU and the controller, the two parts of the CPU, began to be integrated onto a single chip called a "microprocessor". Over the history of integrated circuits, the number of devices on a chip has grown rapidly: the first integrated circuits contained only dozens of components, while by 2006 an Intel Core dual-core processor contained as many as 151 million transistors.

Whether built from vacuum tubes, transistors, or integrated circuits, any of these technologies can serve as the "storage" component of a stored-program architecture by using the flip-flop mechanism, and flip-flops are indeed used as small amounts of ultra-fast storage. However, almost no computer design uses flip-flops for large-scale data storage. The earliest computers stored data by firing electron beams at the screen of a Williams tube, or by sending it down mercury delay lines (in which sound waves travel slowly enough that the data can be regarded as "stored" in transit) and reading it back later. These effective but inelegant methods were eventually displaced by magnetic storage. In magnetic-core memory, for example, a current representing the information induces a persistent weak magnetic field in a ferrite material, and reading that field out again recovers the data. Dynamic random-access memory (DRAM) was also invented: an integrated circuit containing a large number of capacitors that hold the data as stored charge, with the amount of charge defining the value of the data.
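
The bit-storing behaviour of a flip-flop can be sketched in software. The following is an illustrative Python model (an assumption for demonstration, not any particular hardware design) of an SR latch: two cross-coupled NOR gates whose feedback loop "remembers" one bit, which is the idea behind using flip-flops as small, very fast storage:

```python
# Illustrative sketch of an SR (set-reset) latch built from two
# cross-coupled NOR gates. The feedback loop holds one bit of state.

def nor(a: int, b: int) -> int:
    return 0 if (a or b) else 1

def sr_latch(set_: int, reset: int, q: int) -> int:
    # Iterate the feedback loop a few times until the outputs settle.
    q_bar = nor(set_, q)
    for _ in range(4):
        q = nor(reset, q_bar)
        q_bar = nor(set_, q)
    return q

q = 0
q = sr_latch(1, 0, q)   # set   -> q becomes 1
q = sr_latch(0, 0, q)   # hold  -> q stays 1 (the stored bit)
print(q)
q = sr_latch(0, 1, q)   # reset -> q becomes 0
print(q)
```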

Input and output devices

Input/output (I/O) devices are the means by which a computer receives information from the outside world and returns its results to it. These results may be experienced directly by a user, or may serve as input to other devices the computer controls: for a robot, the output of the controlling computer is essentially the robot's own behaviour.

The first generation of computers had very limited input and output devices. The usual input device was a reader for punched cards, used to load instructions and data into memory; the output device used to store results was typically magnetic tape. As technology progressed, the variety of I/O devices grew. Taking the personal computer as an example, the keyboard and mouse are the main tools by which users feed information directly into the computer, while monitors, printers, speakers, and headphones return the results. Many other input devices accept different kinds of information, such as digital cameras for images.

Among I/O devices, two classes deserve particular attention. The first is secondary storage, such as hard disks and optical discs: slower than main memory, but with far larger capacity. The second is network access equipment, which allows direct data transfer between computers and greatly increases their value. Today the Internet lets tens of millions of computers exchange data of every kind.

Programs

Simply put, a computer program is a sequence of instructions executed by a computer. It may be a handful of instructions performing a simple task, or a complex queue of instructions operating on large amounts of data. Many computer programs contain millions of instructions, many of which are executed repeatedly; in 2005 a typical personal computer could execute roughly three billion instructions per second. Computers do not usually gain their capabilities from exotic, complicated instructions; rather, they run enormous numbers of simple, short instructions arranged by programmers. In general, programmers do not write instructions for the computer directly in machine language, since doing so is slow, laborious, inefficient, and error-prone. Instead, they write programs in a "high-level" language, which special programs such as interpreters or compilers then translate into machine language. Some programming languages, such as assembly language, closely resemble machine language and are considered low-level; others, such as Prolog, are built on abstract principles, largely ignore the details of how the machine actually operates, and are described as high-level. For a given task, the language should be chosen according to the nature of the problem, the programmers' skills, the available tools, and the customer's requirements, the last of which is often decisive (U.S. military projects, for example, have often been required to use the Ada language).
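
As a purely illustrative sketch of the idea that a program is a sequence of simple instructions stepped through by the machine, the toy Python "processor" below executes a small instruction list; the instruction names (LOAD, ADD, STORE, JNZ) are invented for this example and do not belong to any real machine:

```python
# Toy sketch: a tiny "machine" that executes a list of simple
# instructions, the way a processor steps through a program.

def run(program, memory):
    acc = 0          # a single accumulator register
    pc = 0           # program counter: index of the next instruction
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JNZ":          # jump if the accumulator is not zero
            if acc != 0:
                pc = arg
                continue
        pc += 1
    return memory

# A high-level statement such as "total = a + b" might be translated
# by a compiler into several low-level steps like these:
mem = {"a": 2, "b": 3, "total": 0}
prog = [("LOAD", "a"), ("ADD", "b"), ("STORE", "total")]
print(run(prog, mem)["total"])   # 5
```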

"Computer software" is a term that is not equivalent to "computer program". Software is a broader term, covering programs of all kinds together with all the related material needed to accomplish a task. A video game, for example, includes not only the program itself but also the images, sounds, and other data that create the virtual game environment. In the retail market, an application installed on a computer is just one copy of the software distributed to a large number of users. The well-worn example here is, of course, Microsoft's office suite, a group of interrelated programs covering common office needs. Building countless powerful applications out of extremely simple machine-language instructions means that the scale of programming is bound to be large: Windows XP, an operating system, contains about 40 million lines of C++ source code, and it is by no means the largest. Software of this size also shows how important management is during development. In practice, a program is subdivided into pieces that individual programmers can finish in an acceptable time. Even so, software development remains slow, unpredictable, and prone to omissions; responding to the demands of the times, software engineering focuses on how to speed up this work and improve its efficiency and quality.

Libraries and operating systems

Shortly after the birth of computers, it was found that certain tasks, such as computing standard mathematical functions, had to be performed in many different programs. For efficiency, standard versions of these routines were collected into a "library" that any program could call. Many tasks also need to handle a variety of input and output interfaces, and here, too, a library of connection routines comes in handy.
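
A trivial sketch of the idea, using Python's standard math library: rather than each program re-implementing a square root routine, they all call the shared library version.

```python
# A library collects routines that many programs need. Instead of every
# program re-implementing a square root, they all call the standard one.
import math

def distance(x1, y1, x2, y2):
    # sqrt comes from Python's math library rather than from this program.
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

print(distance(0, 0, 3, 4))   # 5.0
```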

In the 1960s, as computers spread through industry, they were increasingly used to handle many different tasks within an organization. Soon, special software appeared that could automatically schedule and sequence jobs. This software, which controls the hardware and is responsible for job scheduling, is called an "operating system"; an early example was IBM's OS/360. Through continual refinement, operating systems introduced time-sharing, a form of concurrency that lets many users run their own programs on the machine at the same time, as if each had a private computer. To achieve this, the operating system provides each user with a "virtual machine" that keeps the different programs apart. The range of devices an operating system must control also grew; one of them is the hard disk, for which operating systems introduced file management and directory (folder) management, greatly simplifying the use of such permanent storage. The operating system is also responsible for security, ensuring that users can access only the files they are permitted to. The last major step in the development of operating systems, so far, has been to provide programs with a standard graphical user interface (GUI). There is no technical reason why an operating system must supply these interfaces, but operating system vendors naturally hope and encourage the software running on their systems to look and behave consistently with the system itself.
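
As a small illustration of the file and directory management mentioned above, the sketch below (plain Python, assuming a writable current directory) asks the operating system to create, list, read, and delete a file; the program never touches the disk hardware directly.

```python
# Sketch: a program relies on operating-system services for file and
# directory management instead of driving the disk itself.
import os

with open("example.txt", "w") as f:   # the OS allocates space and tracks the file
    f.write("stored by the operating system\n")

print(os.listdir("."))                # directory listing provided by the OS

with open("example.txt") as f:        # reading back goes through the same OS services
    print(f.read())

os.remove("example.txt")              # the OS reclaims the space
```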

Beyond these core functions, operating systems also bundle a range of commonly used tools. Some of them matter little for managing the computer itself but are very useful to users; Apple's Mac OS X, for example, includes a video editing application. Operating systems for smaller computers may not offer such generality. Early microcomputers, with their limited memory and processing power, provided no extra functions, and embedded computers use specialized operating systems or none at all, often implementing whatever operating-system functions they need directly within the application.

Applications

Computer-controlled machines are common in industry, and many modern mass-produced toys, such as the Furby, depend on cheap embedded processors.

At first, huge and expensive digital computers were used mainly for scientific computation, and especially for military projects. ENIAC, for example, was first used to compute artillery ballistics and to calculate neutron cross-section densities for the design of the hydrogen bomb (many supercomputers still play a large role in simulating nuclear tests today). The CSIR Mk I, the first stored-program computer designed in Australia, was set to evaluating rainfall over the catchment area of a hydroelectric project. Others were used for code-breaking, such as Britain's programmable "Colossus".

Beyond these early scientific and military applications, computers spread into many other fields. From the beginning, stored-program computers were closely tied to business problems: well before IBM's first commercial computer, the British firm J. Lyons and Co. designed and built LEO for asset management and other business purposes. Continuous reductions in size and cost allowed computers to spread to smaller organizations, and with the invention of the microprocessor in the 1970s, cheap computers became a reality. In the 1980s personal computers became widespread, and repetitive clerical work such as writing and printing documents and calculating budgets came to depend more and more on computers. As computers became ever cheaper, creative work began to use them as well: people use synthesizers, computer graphics, and animation to create and modify sound, images, and video, and the rise of the video game industry shows that computers have opened a new chapter in entertainment.

Since computers were miniaturized, the control of mechanical equipment has also come to rely on them; indeed, it was the need for an embedded computer small enough to control the Apollo spacecraft that spurred a leap in integrated circuit technology. Today it is far harder to find an active mechanical device that is not at least partly computer-controlled than one that is. Perhaps the best-known computer-controlled devices are robots, machines that have, to a greater or lesser degree, some subset of human appearance and behaviour. In mass production, industrial robots have become commonplace, but fully humanoid robots still exist only in science fiction and in laboratories. Robotics is essentially the physical expression of the field of artificial intelligence, a loosely defined subject that, broadly, tries to give computers capabilities they do not yet have but that are inherent in humans. Over the years, many new methods have been developed to let computers do things once thought possible only for humans, such as reading text and playing chess; so far, however, progress toward a computer with general, human-like "whole" intelligence has been slow.

As computers have become ever more widespread, they have entered almost every industry and play a decisive role there; they have become indispensable to the normal functioning of modern society. Computers occupy such an important place in modern life, and people depend on them so heavily, that it is hard to imagine what life would be like without them. It is therefore essential to learn basic computer knowledge, to master how to use a computer, and to know how to deal with common computer faults.

Computer science

In today's world, almost every profession is closely tied to computers, but only certain disciplines study the technology of making, programming, and using computers themselves. The meanings of the academic terms used to label the various fields of research within computing keep shifting, and new disciplines emerge continually. Computer engineering is a branch of electronic engineering that studies computer hardware and software and the relationship between them. Computer science is the traditional name for the academic study of computing; it is chiefly concerned with computational techniques and with efficient algorithms for performing specific tasks, and it addresses questions such as whether a problem can be solved on a computer at all, how efficiently it can be solved, and how to build more efficient programs. By now computer science has many branches, each probing a different class of problems in depth. Software engineering focuses on the methodology and practice of building high-quality software systems and on estimating and containing development cost and schedule. Information systems studies the application of computers in broad, organized settings, mainly business. Many of these disciplines intertwine with others; GIS specialists, for example, use computer technology to manage geographic information. Three large organizations worldwide are devoted to computer science: the British Computer Society (BCS), the Association for Computing Machinery (ACM), and the IEEE Computer Society.