Machine learning algorithms and blockchain
In recent years, the impact of cutting-edge AI has come in wave after wave. From Deep Blue to AlphaGo, people have repeatedly marveled at the subtlety of the technology and the powerful learning ability of AI.
So how will AI combine with blockchain, and what breakthroughs and innovations will that bring?
Let's look at the evolution of blockchain: from blockchain 1.0, marked by BTC, to blockchain 2.0, marked by Ethereum and its "smart contracts", to blockchain 3.0, marked by EOS and its stronger scalability.
As far as BTC is concerned, its functionality is limited: for users it amounts to little more than mining and transfers. Hence the upgrade to ETH.
ETH provides a development platform on which users can write smart contracts and publish applications, but its throughput is insufficient and its fees are expensive. This is where BM's EOS comes in.
One of the main reasons EOS gained attention so quickly is that it is more scalable than Ethereum. It is stable and secure, but its super-node governance makes it less decentralized than ETH. In other words, to achieve fast enough data processing, it compromises on the degree of decentralization. Then Velas appeared.
Building on the characteristics of EOS, Velas uses AI to strengthen and realize decentralization. By introducing AI (artificial intelligence) into blockchain technology, it removes the disadvantages of human governance: the system is adjusted automatically by AI to produce the best results without affecting its processing standards, thus reducing the cost of consensus.
"AI is technological innovation, blockchain is institutional innovation"
Blockchain focuses on keeping accurate records, authentication and execution, while AI helps to make decisions, evaluate and understand certain patterns and data sets, and ultimately produce autonomous interaction. AI and blockchain share several characteristics that can ensure seamless interaction in the near future. Three main ones are listed below.
1. AI and blockchain need data sharing
A distributed database emphasizes the importance of data sharing among multiple clients on a particular network. Similarly, AI relies on big data, especially shared data: the more open data is available for analysis, the more accurate the machine's predictions and evaluations, and the more reliable the resulting algorithms.
2. Security
When dealing with high-value transactions on a blockchain network, there are strict requirements on the security of the network, which can be enforced through existing protocols. For artificial intelligence, the autonomy of machines likewise requires high security, in order to reduce the possibility of catastrophic events.
3. Trust is a necessary condition
For any widely accepted technological advance, there is no greater threat than a lack of trust, and artificial intelligence and blockchain are no exception. To make communication between machines more convenient, an expected level of trust is needed; to execute certain transactions on a blockchain network, trust is a necessary condition.
The impact of blockchain and artificial intelligence on ordinary people
In short, blockchain is a community-based technology that makes value exchange more secure. Blockchain is just like its name: each block contains an encrypted record of transactions, the blocks are arranged in chronological order, and security is guaranteed by cryptography. Blockchain is a technology that can change the rules, and its emergence is a revolutionary innovation.
Blockchain has many functions and a very wide range of concrete applications. For example: if blockchain is used in the food industry, people will no longer need to worry about eating harmful food; if blockchain is used in diamond production, consumers will no longer have to worry about buying fake diamonds; if blockchain is used in the education industry, intellectual property protection can be strengthened; if blockchain is used in the insurance industry, it can alleviate the information asymmetry of the insurance business and help improve its security.
The impact of blockchain and artificial intelligence on ordinary people is enormous. Just imagine: if blockchain and artificial intelligence are combined, will their role expand? Yes, combining the two can have an even greater impact on the lives of ordinary people.
Blockchain and artificial intelligence are two extremely important players in the field of technology, bringing convenience to our production and daily life. If we find an intelligent way to make them work together, the impact of their interaction is hard to overstate. This is the core of OMT. Once the two technologies are combined, the future application scenarios are revolutionary and exciting. In building the new ecosystem, problems of data storage, sharing mechanisms, platforms and security can be overcome by each technology drawing on the other. OMT aims to create maximum value for global users and enterprises and to bring more convenience to ordinary people through blockchain plus artificial intelligence.
In recent years, blockchain and artificial intelligence have been very popular
First of all, blockchain establishes a decentralized network. So-called decentralization means that the network belongs neither to you nor to me;
it belongs to everyone.
Artificial intelligence is a new technical science that researches and develops theories, methods, technologies and application systems for simulating, extending and expanding human intelligence; in other words, "machines that learn by themselves". With this in mind, we can think about how blockchain and artificial intelligence might combine.
To begin with, we need to understand that the development of blockchain can be divided into three stages.
However, these three stages all have problems: insufficient decentralization, low scalability, a mismatch between block producers' incentives and the best interests of the whole network, and a network that always runs at maximum capacity, which wastes resources and reduces efficiency. So can we combine artificial intelligence with the underlying public-chain technology to solve these problems?
The answer is yes! Teams are already researching and developing this, and have made real progress. Velas is a public chain that enhances its consensus algorithm with an AI-optimized neural network and carries out self-learning and self-optimization. It is committed to improving the security, interoperability and scalability of transfers and smart contracts. Velas adopts an AI-enhanced DPoS consensus, which achieves decentralization without reducing security or transaction speed. Moreover, the AI chooses whom to delegate staked tokens to according to the needs of the blockchain; Velas only produces blocks when needed, anywhere from every 1 second to every 2 minutes; it scales up to 30,000 TPS; and block producers are selected by "artificial intuition".
Introduction to scikit-learn
linear regression, ridge regression, Lasso regression
matrix decomposition and dimensionality reduction (PCA)
logistic regression, KNN
ensemble algorithms: AdaBoost, random forest, XGBoost
naive Bayes
decision tree
clustering
support vector machine
association rules and sequential patterns
neural network
if you want to learn systematically, you can go to CDA data analyst
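As a taste of the first topics on the list above (linear, ridge and Lasso regression), here is a minimal scikit-learn sketch. The synthetic data, coefficients and regularization strengths are illustrative assumptions, not part of the original answer.

```python
# Minimal sketch: fit linear, ridge and Lasso regression on toy data.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                  # 3 illustrative features
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)  # assumed linear target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=0.01)):
    model.fit(X_train, y_train)                                # learn coefficients from training data
    print(type(model).__name__, model.score(X_test, y_test))   # R^2 on held-out data
```

All three models recover roughly the same coefficients here; the differences only matter when features are correlated or sparse solutions are wanted.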
1. They refer to different things
1. Machine learning: an interdisciplinary field involving probability theory, statistics, approximation theory, convex analysis, computational complexity theory and other disciplines.
2. Deep learning: a new research direction within machine learning, introduced to bring machine learning closer to its original goal, artificial intelligence.
2. They are composed differently
1. Machine learning: the basic structure is a learning system in which the environment supplies information to the learning element; the learning element uses that information to modify the knowledge base so that the performance element can complete its task more efficiently; and the performance element carries out the task according to the knowledge base and feeds information back to the learning element.
2. Deep learning: by designing and building an appropriate number of neuron computing nodes arranged in multiple layers, choosing suitable input and output layers, and learning and optimizing over the network, a functional relationship from input to output is established; it cannot recover the true input-output relationship exactly, but it can approximate it as closely as possible.
3. They are applied differently
1. Machine learning: data mining, computer vision, natural language processing, biometrics, search engines, medical diagnosis, DNA sequencing, speech and handwriting recognition, strategy games and robotics.
2. Deep learning: computer vision, speech recognition, natural language processing and other fields.
For example, suppose the training data consists of pairs such as
0 = sin(0°)
....
1 = sin(90°)
In essence, a machine learning algorithm summarizes the "rule" hidden in known training data and then uses that rule to compute answers for unknown data, whereas an ordinary algorithm has no such ability to summarize rules.
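To make that concrete, here is a small sketch that fits a model to samples of sin(x) and then predicts an unseen angle. The choice of a decision-tree regressor and the 0-90 degree grid are illustrative assumptions.

```python
# Sketch of "summarizing the rule" from training pairs like 0 = sin(0), ..., 1 = sin(90).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X_train = np.linspace(0, 90, 91).reshape(-1, 1)        # angles 0..90 degrees
y_train = np.sin(np.deg2rad(X_train)).ravel()           # known outputs, e.g. sin(0)=0, sin(90)=1

model = DecisionTreeRegressor(max_depth=6).fit(X_train, y_train)
print(model.predict([[45.0]]))   # close to sin(45 deg) ~= 0.707, inferred from the data alone
```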
If you want to learn machine learning algorithms, where do you start?
1. Decision tree: a decision tree is a decision-support tool that models decisions and their possible consequences, including the outcomes of chance events, resource costs and utility, as a tree.
From a business-decision point of view, a decision tree is the minimal set of yes/no questions a person has to answer to assess, most of the time, the probability of making a correct decision. It lets you approach a problem in a structured and systematic way and arrive at a logical conclusion.
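A minimal sketch of a decision tree in scikit-learn follows; the bundled iris dataset and the depth limit of 3 are illustrative assumptions.

```python
# Fit a small decision tree and print the yes/no questions it asks.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(tree.score(X_test, y_test))   # accuracy on held-out data
print(export_text(tree))            # the tree of yes/no splits, printed as text
```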
2. Naive Bayes classification: a naive Bayes classifier is a simple probabilistic classifier based on Bayes' theorem, with strong (naive) independence assumptions between its features.
Its characteristic equation is Bayes' theorem, P(A|B) = P(B|A) P(A) / P(B), where P(A|B) is the posterior probability, P(B|A) is the likelihood, P(A) is the class prior probability, and P(B) is the predictor prior probability. Some real-world examples are:
judging whether an email is spam
classifying a news article by topic: technology, politics or sports
checking whether a piece of text expresses positive or negative sentiment
facial recognition software
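For the spam example above, a minimal naive Bayes sketch might look like this; the tiny corpus and the choice of MultinomialNB with word counts are illustrative assumptions.

```python
# Naive Bayes spam/ham classification on a toy corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["win a free prize now", "limited offer click here",
         "meeting agenda for monday", "lunch with the project team"]
labels = ["spam", "spam", "ham", "ham"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(texts, labels)                        # estimates P(word | class) and class priors
print(clf.predict(["free prize offer"]))      # most likely class under Bayes' theorem
```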
3. Ordinary least squares regression: if you know statistics, you have probably heard of linear regression. Ordinary least squares is a method for performing linear regression.
You can think of linear regression as the task of fitting a straight line through a set of points. There are several possible strategies for doing this, and the "ordinary least squares" strategy goes like this: draw a line, then for each data point measure the vertical distance between the point and the line; the fitted line is the one for which the sum of these distances is as small as possible.
Linear refers to the kind of model you use to fit the data, while least squares refers to the error that the linear model minimizes.
4. Logistic regression: logistic regression is a powerful statistical method for modelling a binomial outcome with one or more explanatory variables. It measures the relationship between a categorical dependent variable and one or more independent variables by estimating probabilities with a logistic function, the cumulative logistic distribution.
logistic regression is used in daily life:
credit rating
measure the success rate of marketing activities
predict the revenue of a proct
predicting whether there will be an earthquake on a given day
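In the spirit of the credit-rating example, here is a small logistic-regression sketch; the synthetic features (standing in for income and existing debt) and the decision rule that generates labels are illustrative assumptions.

```python
# Logistic regression: estimate the probability of a binary outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))                  # e.g. [income, existing debt], standardized
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # 1 = repaid

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([[1.0, -0.5]]))        # estimated probability of each outcome
```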
5. Support vector machine: SVM is a binary classification algorithm. Given points of two classes in an n-dimensional space, SVM generates an (n-1)-dimensional hyperplane that separates the points into two groups.
Suppose you have points of two classes on a sheet of paper that can be separated linearly. SVM finds the straight line that separates the two classes while staying as far away as possible from all the points. At scale, some of the big problems that have been solved with SVMs (with appropriate modifications) include: display advertising, human splice-site recognition, image-based gender detection, and large-scale image classification.
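A minimal sketch of that idea: two linearly separable point clouds and a linear SVM that finds the separating line. The way the clouds are generated is an illustrative assumption.

```python
# Linear SVM separating two point clouds in 2-D.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc=-2.0, size=(50, 2)),    # class 0 cloud
               rng.normal(loc=+2.0, size=(50, 2))])   # class 1 cloud
y = np.array([0] * 50 + [1] * 50)

svm = SVC(kernel="linear").fit(X, y)
print(svm.coef_, svm.intercept_)                 # the separating hyperplane (a line in 2-D)
print(svm.predict([[3.0, 3.0], [-3.0, -3.0]]))   # new points classified by which side they fall on
```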
6. Ensemble methods: ensemble methods are learning algorithms that build a set of classifiers and then classify new data points by taking a weighted vote of their predictions. The original ensemble method was Bayesian averaging, but more recent algorithms include error-correcting output coding, bagging and boosting.
So how do ensemble methods work, and why are they superior to individual models? Three reasons follow, with a brief sketch after them.
They average out bias: if you average a number of pro-Democratic polls with a number of pro-Republican polls, you will get a result that is less biased than either.
They reduce variance: the aggregate opinion of many models is less noisy than the opinion of any single model. In finance this is called diversification: a portfolio of many stocks is less volatile than any individual stock.
They are unlikely to overfit: if the individual models do not overfit and you combine their predictions in a simple way (averaging, weighted averaging, logistic regression), there is generally no room for overfitting.
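Here is a short sketch comparing a single decision tree with a random forest (bagged trees that vote); the synthetic dataset and the number of trees are illustrative assumptions.

```python
# Ensemble of trees vs. a single tree: averaged predictions usually score higher.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

print(cross_val_score(single_tree, X, y).mean())   # one high-variance model
print(cross_val_score(forest, X, y).mean())        # 100 trees voting, lower variance
```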
unsupervised learning
7. Clustering algorithms: clustering is the task of grouping a set of objects so that objects in the same group (cluster) are more similar to each other than to objects in other groups. Every clustering algorithm is different; the main families are listed below, with a short sketch after the list:
centroid-based algorithms
connectivity-based algorithms
density-based algorithms
probabilistic algorithms
dimensionality reduction
neural network / deep learning
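As a sketch of the centroid-based family, here is k-means on synthetic blob data; the use of make_blobs and the choice of k = 3 are illustrative assumptions.

```python
# Centroid-based clustering (k-means) on unlabeled points.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)   # unlabeled points

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.cluster_centers_)    # one centroid per discovered group
print(kmeans.labels_[:10])        # cluster assignment of the first few points
```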
8. Principal component analysis: PCA is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into values of linearly uncorrelated variables called principal components.
Some applications of PCA include data compression, simplifying data to make learning easier, and visualization. Note that domain knowledge is very important when deciding whether to use PCA; it is not suitable for noisy data, where all the components have similarly high variance.
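A small PCA sketch: projecting the 4-dimensional iris measurements onto 2 principal components for compression and visualization. The dataset choice and the number of components are illustrative assumptions.

```python
# PCA: reduce 4 correlated features to 2 uncorrelated principal components.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

pca = PCA(n_components=2).fit(X)
X_2d = pca.transform(X)                  # 150 x 4 reduced to 150 x 2
print(pca.explained_variance_ratio_)     # share of variance kept by each component
print(X_2d[:3])
```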
9. Singular value decomposition: in linear algebra, the SVD is a factorization of a real or complex matrix. For a given m × n matrix M, there exists a decomposition M = U Σ V*, where U and V are unitary matrices and Σ is a diagonal matrix.
PCA is actually a simple application of the SVD. In computer vision, the first face-recognition algorithms used PCA and SVD to represent faces as linear combinations of "eigenfaces", perform dimensionality reduction, and then match faces to identities with simple methods; although modern approaches are more sophisticated, many still rely on similar techniques.
10. Independent component analysis: ICA is a statistical technique for revealing the hidden factors underlying sets of random variables, measurements or signals. ICA defines a generative model for the observed multivariate data, which is typically given as a large database of samples.
In the model, the data variables are assumed to be linear mixtures of some unknown latent variables, and the mixing system is also unknown. The latent variables are assumed to be non-Gaussian and mutually independent; they are called the independent components of the observed data.
ICA is related to PCA, but it is a more powerful technique capable of finding the underlying source factors when the classical methods fail completely. Its applications include digital images, document databases, economic indicators and psychometric measurements.
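A brief sketch of the two decompositions above: NumPy's SVD (M = U Σ V*) and scikit-learn's FastICA recovering two independent sources from a linear mixture. The random matrix and the sine/square-wave sources are illustrative assumptions.

```python
# SVD of an arbitrary matrix, then ICA unmixing of two mixed signals.
import numpy as np
from sklearn.decomposition import FastICA

# Singular value decomposition: M = U @ diag(S) @ Vt.
M = np.random.default_rng(3).normal(size=(5, 3))
U, S, Vt = np.linalg.svd(M, full_matrices=False)
print(np.allclose(M, U @ np.diag(S) @ Vt))       # True: M is reconstructed exactly

# ICA: recover independent, non-Gaussian sources from an unknown linear mixture.
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]     # independent sources
mixed = sources @ np.array([[1.0, 0.5], [0.5, 1.0]])       # unknown mixing matrix
recovered = FastICA(n_components=2, random_state=0).fit_transform(mixed)
print(recovered.shape)                                      # estimated independent components
```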
Or: how are these algorithms categorized? First, the amount of data handled differs. A traditional route-planning problem involves hundreds of objects, which used to be the norm; now data is generated so quickly and in such quantity that a network-optimization problem may involve hundreds of millions of objects, as at Facebook. But that still does not answer the first question: even a network of a million or a hundred thousand objects, such as a dating site, is an order of magnitude at which recommendation algorithms come into play. Recommendation is done with probabilistic models, and some of its results can be learned by machine learning. So, for objects of the same magnitude, is a graph-theory algorithm still needed to solve the problem? I think machine learning mainly lies in a different way of thinking and a more open attitude. The graph-theory algorithms I know are solutions that rest on a very stable understanding of the whole situation; an online machine learning algorithm, by contrast, produces predictions that directly affect the generation of new data, and this approach is basically reliable. For any global algorithm, we can take an open view: machine learning methods can be applied to new, suitable problems backed by large amounts of data.
The difficulty of this era is that we can no longer use our own brains to fully grasp a problem in a moment. We connect our brains together, and we also need more powerful tools to understand unprecedented problems. Since ancient times it has been hard to reason about the interplay of several events; a Venn diagram can clearly show the logical relationships among four or five events. Now there are billions of people, and we do not even know how many events are related to one another, so Venn diagrams are no longer enough. But we can always find the right entry point to outline the whole; our tools have become higher mathematics and reliable matrix computation. So I tend to see machine learning as a dependable way to help us understand new things, with tools drawn from mathematical viewpoints we already trust.
Therefore, the way of thinking behind machine learning is what matters most: under the guidance of any existing reliable viewpoint, we can expand our way of understanding the world. I would describe it as a mechanism for forming collective intelligence. Why collective? As an individual, I do not need to match ten thousand faces to their names, but as a company I need to recognize my customers and greet them within a second in order to serve them. In other words, we live in an era in which collective intelligence plays an immeasurable role. Look around: most of the objects you use do not come from people you know personally. Ideas have grown the same way. China's small-scale peasant economy was self-sufficient for a long time; back then, eating food grown by strangers and wearing clothes woven and cut by strangers would have felt deeply uncomfortable. How different is that from people today turning away Google's Street View cars to protect their privacy? In more recent history, what spread were mostly the concrete products of collective intelligence; now, what spreads more directly is collective intelligence itself.
The philosophy behind machine learning should be just such an open, future-oriented attitude, which I quite agree with. I also hope to tap collective intelligence and produce unprecedented business value.