
Machine Learning

The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, Dr. Thomas Gottron, Dr. Florian Lemmerich, Dr. Christoph Kling, Prof. Dr. Steffen Staab, 2019, 33 p. K-means; Expectation maximization; DBSCAN; Agglomerative hierarchical clustering.
  • No. 1
  • 984.92 KB
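The entry above lists K-means among the clustering methods covered. As a purely illustrative aid (not taken from the slides), here is a minimal Python sketch of Lloyd's K-means iteration; the toy data, the choice of k, and the random initialization are assumptions:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]   # random initial centroids
    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid becomes the mean of its assigned points
        # (kept unchanged if a cluster happens to be empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Toy data: two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)),
               rng.normal(3.0, 0.5, size=(50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids)
```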
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 50 p. Defining task; Designing features; Preprocessing; Outlier removal; Feature scaling; Feature correlation measurement; Missing data; Class imbalance problem.
  • No. 2
  • 2.01 MB
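The entry above mentions feature scaling and missing data among the preprocessing steps. A minimal sketch, assuming a toy matrix with one missing value, of mean imputation followed by z-score standardization (illustrative only, not from the slides):

```python
import numpy as np

X = np.array([[1.0, 200.0],
              [2.0, np.nan],   # missing value
              [3.0, 180.0],
              [4.0, 220.0]])

# Impute missing entries with the per-feature mean of the observed values.
col_mean = np.nanmean(X, axis=0)
X_imputed = np.where(np.isnan(X), col_mean, X)

# Standardize each feature to zero mean and unit variance so that features
# on different scales contribute comparably (e.g. to distance-based models).
mu = X_imputed.mean(axis=0)
sigma = X_imputed.std(axis=0)
X_scaled = (X_imputed - mu) / sigma
print(X_scaled)
```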
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 47 p. Context-dependent classification; Markov chain; Hidden Markov model; Recognition; Decoding; Training; Data dimension reduction; Principal component analysis; Singular value decomposition.
  • No. 3
  • 1.72 MB
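The PCA and SVD topics listed above can be illustrated with a short sketch; the toy data and the choice of keeping two components are assumptions, and this is not code from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # 200 samples, 5 features
X_centered = X - X.mean(axis=0)

# Rows of Vt are the principal directions; squared singular values give the
# explained variance (up to the 1/(n-1) factor).
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained_variance = S ** 2 / (len(X) - 1)

k = 2                                  # keep the top-2 components
X_reduced = X_centered @ Vt[:k].T
print(X_reduced.shape, explained_variance[:k])
```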
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 72 p. Data dimension reduction; Principal component analysis; Singular value decomposition; Clustering; Unsupervised learning; Evaluate clusters; Intrinsic and extrinsic evaluation measures; How K-means works; Choosing K; EM algorithm.
  • No. 4
  • 4.29 MB
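As a small illustration of the extrinsic cluster-evaluation measures mentioned above, here is a hedged sketch of purity (majority-class agreement per cluster); the example labels are assumptions:

```python
from collections import Counter

def purity(cluster_labels, true_labels):
    # Each cluster is credited with the count of its most frequent true class.
    correct = 0
    for c in set(cluster_labels):
        members = [t for cl, t in zip(cluster_labels, true_labels) if cl == c]
        correct += Counter(members).most_common(1)[0][1]
    return correct / len(true_labels)

# Toy example: 6 points, 2 clusters, 2 ground-truth classes.
print(purity([0, 0, 0, 1, 1, 1], ["a", "a", "b", "b", "b", "a"]))  # 4/6
```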
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 47 p. Defining task; Designing features; Preprocessing; Outlier removal; Feature scaling; Feature correlation measurement; Missing data; Class imbalance problem; k-nearest; Classification; Regression; K-D tree; Overfitting; Evaluation; Confusion matrix; Precision; Recall...
  • No. 5
  • 1.82 MB
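The k-nearest-neighbour classification listed above can be sketched in a few lines; this brute-force version (no K-D tree), the choice of Euclidean distance, k=3, and the toy data are all assumptions:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the query point to every training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    # Majority vote among the k nearest labels.
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1])))  # expected: 0
```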
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 43 p. Class imbalance problem; k-nearest; Classification; Regression; K-D tree; Overfitting; How to choose the best K; Evaluation; Confusion matrix; Precision; Recall; F-Score; Overall accuracy; Bayesian classification; Bayes theorem; Naive Bayes.
  • No. 6
  • 1.78 MB
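For the evaluation measures listed above (confusion matrix, precision, recall, F-score, overall accuracy), a minimal binary-classification sketch; the example label vectors are assumptions:

```python
def binary_metrics(y_true, y_pred):
    # Confusion-matrix counts for the positive class 1.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = (tp + tn) / len(y_true)
    return {"precision": precision, "recall": recall, "f1": f1, "accuracy": accuracy}

print(binary_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))
```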
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 47 p. How to evaluate a classification model; Bayesian classification; Bayes theorem; Maximum Likelihood Estimation; Maximum a Posteriori Probability Estimation; Bayesian classification; Naïve Bayes; Decision tree.
  • No. 7
  • 2.08 MB
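The Bayes-theorem and Naive Bayes topics above admit a short sketch. This Gaussian Naive Bayes version (per-class means and variances as MLE parameters, prediction by the largest posterior score) is an illustrative assumption, not the lecture's own example:

```python
import numpy as np

def fit_gnb(X, y):
    # Per class: prior, feature means, feature variances (small floor for stability).
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (len(Xc) / len(X), Xc.mean(axis=0), Xc.var(axis=0) + 1e-9)
    return params

def predict_gnb(params, x):
    best, best_score = None, -np.inf
    for c, (prior, mu, var) in params.items():
        # log P(c) + sum_i log N(x_i | mu_i, var_i): the "naive" independence assumption.
        score = np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        if score > best_score:
            best, best_score = c, score
    return best

X = np.array([[1.0, 2.0], [1.2, 1.8], [3.0, 3.5], [3.2, 3.7]])
y = np.array([0, 0, 1, 1])
print(predict_gnb(fit_gnb(X, y), np.array([1.1, 2.1])))  # expected: 0
```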
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 36 p. Naïve Bayes; Decision tree; More about decision trees; Pruning; Random forest; Underfitting, overfitting, and generalization.
  • No. 8
  • 1.96 MB
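The decision-tree material above rests on a split criterion; a minimal sketch of entropy and information gain for one candidate binary split, with assumed toy labels:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, left, right):
    # Parent entropy minus the size-weighted entropy of the two child nodes.
    n = len(labels)
    return entropy(labels) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)

labels = ["yes", "yes", "yes", "no", "no", "no"]
# Candidate split that separates the classes almost perfectly.
left, right = ["yes", "yes", "yes", "no"], ["no", "no"]
print(information_gain(labels, left, right))
```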
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 45 p. Decision tree; Random forest; Linear regression; Least squares function; Optimization; Linear classification; Perceptron classifier; Support vector machines (SVM); Optimization.
  • No. 9
  • 2.04 MB
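For the linear regression and least-squares topics above, a minimal sketch of the closed-form normal-equation solution on assumed synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)   # true slope 2, intercept 1

# Design matrix with a bias column; w minimizes ||Xw - y||^2 via the normal equations.
X = np.column_stack([x, np.ones_like(x)])
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)   # approximately [2.0, 1.0]
```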
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 52 p. Linear regression; Perceptron classifier; Support vector machine; Non-separable cases; Non-linearly separable cases; Neural network; From perceptron to one-layer perceptron; Multi-layer perceptron; Optimization.
  • No. 10
  • 2.68 MB
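The perceptron classifier listed above can be sketched with its classical learning rule; the learning rate, epoch count, and toy linearly separable data are assumptions:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):                    # labels y in {-1, +1}
            if yi * (np.dot(w, xi) + b) <= 0:       # misclassified point -> update
                w += lr * yi * xi
                b += lr * yi
    return w, b

X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
print(w, b, np.sign(X @ w + b))   # all four points classified correctly
```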
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 56 p. XOR problem; Two-layer perceptron; How the network learns; Gradient descent; Backpropagation; Activation functions (Sigmoid, Softmax, Linear, Mixture density, ReLU); Loss functions (Binary cross-entropy, Discrete cross-entropy, Gaussian cross-entropy, Cross-entropy).
  • No. 11
  • 3.32 MB
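The XOR problem, two-layer perceptron, gradient descent, and backpropagation topics above fit naturally into one small sketch; the network size, learning rate, random seed, and iteration count are assumptions, and convergence on this toy run is typical rather than guaranteed:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: hidden layer, then output probability.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: binary cross-entropy with sigmoid output gives this simple delta.
    d_out = (p - y) / len(X)
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h;   db1 = d_h.sum(axis=0, keepdims=True)
    # Plain gradient-descent update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(np.round(p, 2).ravel())   # typically close to [0, 1, 1, 0]
```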
The lecture, Institute for Web Science and Technologies, University of Koblenz-Landau, Germany, Dr. Zeyd Boukhers, 2019, 42 p. Deep neural networks; Context-dependent classification; Markov chain; Hidden Markov model; Recognition; Decoding; Training; Viterbi algorithm.
  • No. 12
  • 1.62 MB
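For the Viterbi-decoding topic above, a minimal hidden-Markov-model sketch; the two-state weather-style model parameters are toy assumptions, not taken from the slides:

```python
import numpy as np

states = ["Rain", "Sun"]
obs_symbols = {"walk": 0, "shop": 1, "clean": 2}
pi = np.array([0.6, 0.4])                          # initial state probabilities
A = np.array([[0.7, 0.3], [0.4, 0.6]])             # transition probabilities
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])   # emission probabilities

def viterbi(obs):
    T, N = len(obs), len(states)
    delta = np.zeros((T, N))            # best path probability ending in state j at time t
    psi = np.zeros((T, N), dtype=int)   # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = scores.argmax()
            delta[t, j] = scores.max() * B[j, obs[t]]
    # Backtrack the most probable state sequence.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([obs_symbols[o] for o in ["walk", "shop", "clean"]]))
```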
University of Würzburg, Germany, Univ.-Prof. Dr. rer. nat. Ingo Scholtes, 2022, 76 p. Introducing the Chair of Informatics XV: Research interests; Teaching portfolio. Machine Learning for Graph-Structured Data: What is ML; Supervised vs unsupervised; ML for Euclidean data; Learning in graph-structured data; Geometric ML. Interdisciplinary Applications of Graph...
  • No. 13
  • 23.38 MB
The lecture, University of Würzburg, Germany, Univ.-Prof. Dr. rer. nat. Ingo Scholtes, 2022, 72 p. Motivation; Network; Adjacency matrix; Undirected networks; Self-loops; Weighted networks; Node degrees; Weighted node degree; Visualizing networks; Walks, paths, and cycles; Powers of adjacency matrices; Topological distance; Finding shortest paths; Connected components; From components to...
  • No. 14
  • 3.72 MB
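Several of the network concepts above (adjacency matrix, node degrees, powers of the adjacency matrix, shortest topological distance) can be illustrated together; the small edge list is an assumption:

```python
import numpy as np
from collections import deque

edges = [(0, 1), (1, 2), (2, 3), (0, 3), (3, 4)]
n = 5
A = np.zeros((n, n), dtype=int)
for u, v in edges:                          # undirected: symmetric adjacency matrix
    A[u, v] = A[v, u] = 1

degrees = A.sum(axis=1)                     # node degrees = row sums
walks_len2 = np.linalg.matrix_power(A, 2)   # (A^2)[i, j] counts walks of length 2

def bfs_distances(A, source):
    # Breadth-first search gives the topological (shortest-path) distance.
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in np.nonzero(A[u])[0]:
            v = int(v)
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

print(degrees, bfs_distances(A, 0))
```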
The lecture, University of Würzburg, Germany, Univ.-Prof. Dr. rer. nat. Ingo Scholtes, 2022, 60 p. Generative Models and Statistical Ensembles; G(n, m) random graph model; G(n, p) random graph model; Degree distribution of the random graph; Random graphs with given degrees; Generative models and likelihood; Statistical inference in networks.
  • No. 15
  • 4.41 MB
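For the G(n, p) random-graph model listed above, a minimal generator sketch that also compares the empirical mean degree with its expected value (n-1)p; the parameter choices are assumptions:

```python
import random
from collections import defaultdict

def gnp(n, p, seed=0):
    # Each of the n(n-1)/2 possible edges is included independently with probability p.
    rng = random.Random(seed)
    adj = defaultdict(set)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

n, p = 1000, 0.01
g = gnp(n, p)
degrees = [len(g[i]) for i in range(n)]
print(sum(degrees) / n, (n - 1) * p)   # empirical mean degree vs. expected (n-1)p
```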