The relationship between algorithms, computing power, and data
The birth and development of the modern computer. The computer's development has passed through three stages: the mechanical computer, the electromechanical computer, and the electronic computer.
As early as the 17th century, European mathematicians began to design and build machines that performed basic operations in digital form. In 1642 the French mathematician Pascal built the earliest decimal adding machine, using a gear transmission similar to that of clocks and watches. In 1678 the German mathematician Leibniz built a machine that went further and could also handle multiplication and division of decimal numbers.
The British mathematician Babbage put forward an idea while building his Difference Engine model in 1822: a machine that performs one arithmetic operation at a time could be developed into one that carries out a complete sequence of operations automatically. In 1834 Babbage designed a program-controlled universal Analytical Engine. Although this design sketched out the rudiments of the program-controlled computer, it could not be realized with the technology of the time.

In the more than one hundred years after Babbage's idea was put forward, great progress was made in electromagnetism, electrical engineering, and electronics; among components and devices, the vacuum diode and the vacuum triode were invented in succession. In systems technology, wireless telegraphy, television, and radar appeared one after another. All these achievements prepared the technical and material conditions for the development of the modern computer.

At the same time, mathematics and physics were developing rapidly. By the 1930s every field of physics had gone through a stage of quantification, and some of the mathematical equations describing physical processes proved difficult to solve by classical analytic methods. As a result, numerical analysis received renewed attention, and methods for numerical integration, numerical differentiation, and the numerical solution of differential equations were developed. Computation was thereby reduced to an enormous number of basic operations, laying the foundation for the numerical algorithms of the modern computer.
Society's urgent need for advanced computing tools was the fundamental driving force behind the birth of the modern computer. Since the beginning of the 20th century, computational difficulties had piled up in every field of science and technology and were hindering the further development of those disciplines. Around the outbreak of the Second World War in particular, the need for high-speed computing tools in military science and technology became especially pressing. During this period Germany, the United States, and the United Kingdom began research on electromechanical and electronic computers almost simultaneously.
In Germany, Konrad Zuse was the first to build a computer from electrical components. The fully automatic relay computer Z-3, which he completed in 1941, already had characteristics of the modern computer, such as floating-point arithmetic, binary operation, and instructions carrying numeric storage addresses. In the United States, the relay computers Mark-1, Mark-2, and Model-1 through Model-5 were built between 1940 and 1947. However, the switching time of a relay is about one hundredth of a second, which greatly limited computing speed.
The development of the electronic computer evolved from components to complete machines, from special-purpose machines to general-purpose machines, and from "external programs" to "stored programs". In 1938 the Bulgarian-American scholar Atanasoff first built an arithmetic unit for an electronic computer. In 1943 the communications department of the British Foreign Office completed the "Colossus" computer, a special-purpose cryptanalysis machine used during the Second World War.

In February 1946, ENIAC, the Electronic Numerical Integrator and Computer, was completed at the Moore School of the University of Pennsylvania. At first ENIAC was used exclusively for artillery trajectory calculations; later it was improved many times and became a general-purpose machine capable of various scientific calculations. This computer, which used electronic circuits to perform arithmetic operations, logical operations, and information storage, was 1,000 times faster than relay computers. It was the first electronic computer in the world. However, its program was still external, its storage capacity was too small, and it did not yet possess the main characteristics of the modern computer.
The new breakthrough was achieved by a design team led by the mathematician von Neumann. In March 1945 they published the scheme for a new stored-program general-purpose electronic computer, the Electronic Discrete Variable Automatic Computer (EDVAC). In June 1946, von Neumann and others put forward a more complete design report, "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument". From July to August of the same year they taught a course, "Theory and Techniques for Design of Electronic Digital Computers", for experts from more than twenty institutions in the United States and Britain at the Moore School, which promoted the design and manufacture of stored-program computers.

In 1949 the Mathematical Laboratory of Cambridge University took the lead in completing EDSAC; in 1950 the United States completed the Standards Eastern Automatic Computer (SEAC). At this point the embryonic period of the electronic computer came to an end, and the development period of the modern computer began.
Alongside the digital computer, another important class of computing tool was developed: the analog computer. When physicists summarize the laws of nature, they often describe a process with mathematical equations; conversely, the solution of mathematical equations can proceed by simulating a physical process. After the invention of logarithms, the slide rule, made around 1620, turned multiplication and division into addition and subtraction. Maxwell ingeniously transformed the computation of an integral (an area) into the measurement of a length, and built an integrating instrument in 1855.

Fourier analysis, another great achievement of 19th-century mathematical physics, directly promoted the development of analog machines. In the late 19th and early 20th centuries, a variety of analyzers were developed for computing Fourier coefficients and solving differential equations. However, when people tried to extend the differential analyzer to partial differential equations and to use analog machines for general scientific computation, they gradually recognized the limitations of analog machines in generality and precision, and turned their main energy to the digital computer.
After the advent of the electronic digital computer, the analog computer continued to develop, and its combination with the digital computer produced the hybrid computer. Analog and hybrid computers have become special varieties of the modern computer, that is, efficient information-processing or simulation tools used in specific fields.
Since the middle of the 20th century, the computer has been in a period of rapid development. The computer has grown from a hardware-only system into a computer system comprising three subsystems: hardware, software, and firmware. The performance-to-price ratio of computer systems has increased by two orders of magnitude every ten years. Computers have diversified into microcomputers, minicomputers, general-purpose computers (including supercomputers and large and medium-sized machines), and various special-purpose computers (such as control computers and analog-digital hybrid machines).
In terms of devices, computers have made three leaps: from vacuum tubes to transistors, and from discrete components to integrated circuits and then to microprocessors.

In the vacuum-tube era (1946-1959), computers were mainly used for scientific computation, and main memory was the chief factor determining the face of computer technology. The main memories of the time included mercury delay lines, cathode-ray-tube electrostatic memory, magnetic drums, and magnetic cores, and computers were usually classified by them.
In terms of computing power: once we have data, we need to train, and to train repeatedly. A single pass over the training set from beginning to end is not enough; it is like telling a child a truth exactly once and expecting it to stick, which works only for a prodigy with a photographic memory. Besides training, AI also needs hardware to run on and inference to perform, all of which requires computing support.
So artificial intelligence cannot do without computing power, and as systems become more and more intelligent, ever more and stronger computing power is needed.
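As a minimal sketch of why training dominates the compute bill, the Python snippet below makes repeated passes (epochs) over a toy dataset, paying the cost of the whole dataset on every pass. The data, model, and learning rate are all hypothetical placeholders.

```python
import numpy as np

# Toy data: 1,000 samples, 10 features (a hypothetical stand-in for a real dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
true_w = rng.normal(size=10)
y = X @ true_w + 0.1 * rng.normal(size=1000)

# One pass over the data is rarely enough, so we sweep the training set
# many times; every sweep costs compute proportional to dataset size.
w = np.zeros(10)
lr = 0.1
for epoch in range(50):                    # repeated passes, like re-telling the child
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
    w -= lr * grad                         # one full-batch update per pass

print("final training error:", np.mean((X @ w - y) ** 2))
```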
There is less difference between artificial intelligence and traditional programming than people think; the main difference is that AI needs large amounts of data and computing power to fit a model.
AI = big data (data) + algorithms (deep learning, rule-based, knowledge-based, statistics-based, etc., mostly iterative and matrix-heavy structures) + computing power (very high computing power; with it, intelligent algorithms work better)
Traditional software programming = data structures (a small amount of data relative to AI) + algorithms (not very complex for the machine, with little recursive computation) + computing power (not much computing power required)
3D simulation software = data structures (a medium amount of data relative to common application software) + algorithms (similar to AI algorithms but, for the most part, without the recursion or matrix operations) + medium computing power (the computing power needed by 3D simulation software is not low: lower than AI algorithms but higher than ordinary application software, although some special application software may demand even more than 3D software)
By now, I believe, we can all see that an artificial-intelligence program is not so different from ordinary software. The difference lies in the algorithms: traditional programming rests mostly on logical operations, while artificial-intelligence algorithms include logical operations plus more complex modeling and fitting. Understand linear algebra thoroughly, and AI algorithms are not out of reach.
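To make the linear-algebra point concrete, here is a minimal sketch of "fitting a model" as nothing more than solving a least-squares problem with NumPy; the data is synthetic and purely illustrative.

```python
import numpy as np

# Synthetic data: outputs are a hidden linear combination of inputs plus noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.normal(size=200)

# "Fitting the model" is pure linear algebra: minimize ||X w - y||^2.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # should be close to [2.0, -1.0, 0.5]
```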
Second, big data lays the foundation for an intelligent society. The development of artificial intelligence rests on three foundations, namely data, computing power, and algorithms, so big data is of great significance to the development of AI. One important reason AI applications have improved so markedly is the large volume of supporting data, which strengthens both the training and the validation of algorithms and thus improves their effectiveness in practice.
Third, big data promotes the digitalization of social resources. The development of big data lets data produce greater value, and this greatly accelerates the digitalization of social resources. As more social resources are digitalized, the functional boundary of big data keeps expanding, driving a series of innovations built on big data.
Finally, one important reason big data matters is that it has opened up a new field of value: big data will gradually become an important means of production. It could even be said that big data will be an emerging energy source of the intelligent society.
Some people say artificial intelligence (AI) is the future; some say it is science fiction; and some say it is already part of our daily life. All of these descriptions can be correct, depending on which kind of artificial intelligence you mean.
Earlier this year, Google DeepMind's AlphaGo beat the South Korean Go master Lee Sedol. As the media described DeepMind's victory, the terms AI, machine learning, and deep learning were all used. All three played a part in AlphaGo's defeat of Lee Sedol, but they are not the same thing.
Today we will use the simplest method, concentric circles, to show the relationship and application of the three visually.
Artificial neural networks are an important algorithm from the early days of machine learning that has been through decades of ups and downs. The principle of neural networks is inspired by the physiological structure of our brains: interconnected neurons. But unlike a neuron in the brain, which can connect to any neuron within a certain distance, artificial neural networks have discrete layers and fixed connections and directions of data propagation.
For example, we can cut an image into blocks and feed them to the first layer of the neural network. Each neuron in the first layer passes data to the second layer; the second layer's neurons do the same, passing data to the third layer, and so on, until the last layer generates the result.
Each neuron assigns a weight to its inputs, reflecting how relevant each is to the task it performs, and the final output is determined by the weighted sum of those inputs. Take the stop sign as an example again: the elements of a stop-sign image are broken apart and "checked" by the neurons: the octagonal shape, the fire-engine red color, the prominent letters, the typical size of a traffic sign, the fact that it is motionless, and so on. The neural network's task is to conclude whether or not this is a stop sign. Based on all the weights, it gives a carefully considered "guess": a probability vector.
In this example, the system might report: 86% probability it is a stop sign, 7% probability it is a speed-limit sign, 5% probability it is a kite stuck in a tree, and so on. The network architecture then tells the neural network whether its conclusion is correct.
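Here is a minimal sketch of the forward pass just described: each layer computes a weighted sum of its inputs, and the final scores are squashed into a probability vector. The layer sizes, random weights, and three labels are stand-ins, not a real stop-sign classifier.

```python
import numpy as np

rng = np.random.default_rng(42)

def softmax(z):
    """Turn raw scores into a probability vector that sums to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical network: 64 input features -> 16 hidden units -> 3 classes
# (stop sign, speed-limit sign, kite). Weights are random placeholders.
W1, b1 = rng.normal(size=(16, 64)), np.zeros(16)
W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)

x = rng.normal(size=64)             # one flattened image block
h = np.maximum(0, W1 @ x + b1)      # weighted sum + ReLU, layer 1
probs = softmax(W2 @ h + b2)        # layer 2 scores -> probability vector

for label, p in zip(["stop sign", "speed limit", "kite"], probs):
    print(f"{label}: {p:.0%}")
```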
Even at this level, the approach counts as quite advanced; yet until recently, neural networks were all but forgotten by the AI community. In fact, neural networks existed in the earliest days of artificial intelligence, but they contributed very little to "intelligence". The main problem was that even the most basic neural network demands a great deal of computation, and the hardware of the time could hardly meet the demands of neural-network algorithms.
However, a few devoted research teams, represented by Geoffrey Hinton of the University of Toronto, persisted, working toward running and proving the concept of parallelized algorithms on supercomputers. But only when GPUs became widely available did these efforts pay off.
Let's go back to the stop-sign recognition example. A neural network is tuned by training, and early on it makes mistakes all the time. What it needs most is training: it takes hundreds of thousands or even millions of images, until the input weights of the neurons are tuned so precisely that the network gives the correct result almost every time, whether in fog, sunshine, or rain.
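What "tuning the input weights" looks like in code: a sketch of a one-layer classifier trained by mini-batch gradient descent on random stand-in "images", nudging the weights after every batch until the predictions match the labels. All data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical training set: 5,000 flattened "images" with a binary label
# (stop sign or not). Real training sets are far larger.
X = rng.normal(size=(5000, 64))
w_true = rng.normal(size=64)
y = (X @ w_true > 0).astype(float)

w = np.zeros(64)
for epoch in range(20):                       # many passes over the data
    for i in range(0, len(X), 100):           # mini-batches of 100 images
        xb, yb = X[i:i + 100], y[i:i + 100]
        p = 1 / (1 + np.exp(-(xb @ w)))       # current predictions
        w -= 0.1 * xb.T @ (p - yb) / len(yb)  # nudge weights toward the labels

acc = np.mean(((X @ w) > 0) == y)
print(f"training accuracy after tuning: {acc:.1%}")
```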
Only then can we say the neural network has successfully learned what a stop sign looks like; or, in the Facebook app, learned to recognize your mother's face; or, as Professor Andrew Ng's system at Google did in 2012, learned what a cat looks like; and so on.
Professor Ng's breakthrough was to scale these neural networks up enormously, with many more layers and neurons, and then feed the system massive amounts of data to train on; in his case, the data was images from 10 million YouTube videos. Professor Ng put the "deep" in deep learning: the "depth" refers to the many layers in the neural network.

Today, image recognition trained by deep learning can in some settings even outperform humans: from recognizing cats, to identifying early indicators of cancer in blood, to finding tumors in MRI scans. Google's AlphaGo first learned how to play Go, then trained against itself: it tuned its own neural network by playing game after game against itself, repeatedly and without stopping.
Deep learning gives AI a bright future
Deep learning has enabled machine learning to support many applications and has expanded the overall field of AI. Deep learning accomplishes all kinds of tasks by breaking them down, in ways that make all kinds of machine assistance possible. Driverless cars, preventive healthcare, and even better movie recommendations are all within reach or on the way.
AI is the present, and it is the future. With deep learning, AI may even reach the science-fiction level we imagine. I'll take the C-3PO; I'll pass on the Terminator.
At the same time, as data is officially recognized by the central government as a new factor of production, higher and stricter standards for privacy and security are bound to follow.
However, the crux of the problems of privacy, data leakage, and data monopoly lies in the centralized data-processing mode of traditional deep learning.
To balance data privacy against data-value mining, Tongdun Technology has proposed a theoretical framework called "knowledge federation", which supports federation at four levels (the information, model, cognition, and knowledge layers) so as to make data usable but invisible. In this way it breaks down the data barriers between participants and makes full use of each participant's data, while ensuring that data never leaves its owner, protecting data privacy.
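Knowledge federation as described is related in spirit to federated learning. The sketch below illustrates only the generic "data stays put, weights travel" idea (a plain federated-averaging toy, not Tongdun's actual protocol): each participant runs gradient steps on its own private data, and only model weights are shared and averaged.

```python
import numpy as np

rng = np.random.default_rng(3)

def local_update(w, X, y, lr=0.1, steps=10):
    """Each participant refines the shared model on its own private data."""
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)  # local gradient steps
    return w

# Three participants with private datasets that never leave their machines.
w_true = rng.normal(size=5)
parts = []
for _ in range(3):
    X = rng.normal(size=(100, 5))
    parts.append((X, X @ w_true + 0.1 * rng.normal(size=100)))

w = np.zeros(5)
for round_ in range(20):                              # federation rounds
    # Only weight vectors cross the boundary, never the raw data.
    local_ws = [local_update(w, X, y) for X, y in parts]
    w = np.mean(local_ws, axis=0)                     # server averages updates

print("recovered weights:", np.round(w, 2))
print("true weights:     ", np.round(w_true, 2))
```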
Algorithms, computing power, and data are the three most important components of the field of artificial intelligence. For a long time, AI companies paid more attention to algorithms and computing power than to data.

However, as the commercialization of AI accelerates, more and more companies are beginning to recognize the importance of data. Almost all the data required for machine learning comes from the data-annotation industry, and the quality of a dataset directly determines the quality of the final model.
It can be said that data annotation is the cornerstone of the entire artificial-intelligence industry. In the future, the data-annotation industry will develop in the direction of refinement, scenario-specific work, and customization.
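To make "dataset quality determines model quality" concrete, here is a hypothetical annotation record of the kind this industry produces; every field name is made up for illustration and is not any vendor's actual schema.

```python
import json

# A hypothetical image-annotation record: one bounding box plus a label,
# with reviewer sign-off. Field names are invented for illustration.
record = {
    "image": "frames/000123.jpg",
    "annotations": [
        {
            "label": "stop_sign",
            "bbox": [412, 180, 498, 266],  # x_min, y_min, x_max, y_max
            "annotator": "worker_17",
            "reviewed": True,              # QA pass: quality in, quality out
        }
    ],
}
print(json.dumps(record, indent=2))
```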