\documentclass[MASTER.tex]{subfiles}
\begin{document}
%=======================================================================%
\begin{frame}
\LARGE
\textbf{Decision Trees}
\begin{itemize}
\item Decision Trees (DTs) are a non-parametric supervised learning method used for classification and regression.
\item The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
\end{itemize}
\end{frame}
\begin{frame}
A decision tree is a simple representation for classifying examples, and decision tree learning is one of the most widely used techniques for supervised classification. For this section, assume that all of the features have finite discrete domains and that there is a single target feature called the classification. Each element of the domain of the classification is called a class.
\end{frame}
%=======================================================================%
\begin{frame}
\LARGE
\textbf{Decision Trees}
\begin{itemize}
\item A decision tree or a classification tree is a tree in which each internal (non-leaf) node is labeled with an input feature.
\item The arcs coming from a node labeled with a feature are labeled with each of the possible values of the feature.
\item Each leaf of the tree is labeled with a class or a probability distribution over the classes; the sketch on the next slide shows how such learned rules can be printed.
\end{itemize}
\end{frame}
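%=======================================================================%
\begin{frame}[fragile]
\LARGE
\textbf{Decision Trees}\\
{\normalsize As an illustrative sketch (assuming scikit-learn is available; \texttt{max\_depth=2} is an arbitrary choice to keep the output short), the rules learned by a fitted tree can be printed with \texttt{export\_text}:}
{
\normalsize
\begin{framed}
\begin{verbatim}
>>> from sklearn.datasets import load_iris
>>> from sklearn.tree import DecisionTreeClassifier
>>> from sklearn.tree import export_text
>>> iris = load_iris()
>>> clf = DecisionTreeClassifier(max_depth=2)
>>> clf = clf.fit(iris.data, iris.target)
>>> print(export_text(clf,
...       feature_names=iris.feature_names))
\end{verbatim}
\end{framed}
}
{\normalsize Each printed rule tests one feature at an internal node, matching the node/arc/leaf structure described above.}
\end{frame}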
%=======================================================================%
\begin{frame}
\LARGE
\textbf{Decision Trees}
\begin{itemize}
\item For instance, a decision tree can learn from data to approximate a sine curve with a set of \textit{If-Then-Else} decision rules (see the sketch on the next slide).
\item The deeper the tree, the more complex the decision rules and the closer the fit to the training data.
\end{itemize}
\end{frame}
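%=======================================================================%
\begin{frame}[fragile]
\LARGE
\textbf{Decision Trees}\\
{\normalsize A minimal sketch of this idea (assuming NumPy and scikit-learn; the grid of 100 points and \texttt{max\_depth=3} are illustrative choices):}
{
\normalsize
\begin{framed}
\begin{verbatim}
>>> import numpy as np
>>> from sklearn.tree import DecisionTreeRegressor
>>> X = np.linspace(0, 5, 100).reshape(-1, 1)
>>> y = np.sin(X).ravel()
>>> reg = DecisionTreeRegressor(max_depth=3)
>>> reg = reg.fit(X, y)
>>> reg.predict([[2.5]])
\end{verbatim}
\end{framed}
}
{\normalsize Each leaf predicts the mean target of its region, so the fitted function is a piecewise-constant approximation of the sine curve.}
\end{frame}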
%========================================================================= %
\begin{frame}[fragile]
\LARGE
\textbf{Decision Trees}\\
Using the Iris dataset, we can construct a tree as follows:
{
\normalsize
\begin{framed}
\begin{verbatim}
>>> from sklearn.datasets import load_iris
>>> from sklearn import tree
>>> iris = load_iris()
>>> clf = tree.DecisionTreeClassifier()
>>> clf = clf.fit(iris.data, iris.target)
\end{verbatim}
\end{framed}
}
\end{frame}
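%========================================================================= %
\begin{frame}[fragile]
\LARGE
\textbf{Decision Trees}\\
{\normalsize After fitting, the tree can predict the class (and class probabilities) of new samples; here, as a simple check, we re-use the first training sample:}
{
\normalsize
\begin{framed}
\begin{verbatim}
>>> clf.predict(iris.data[:1, :])
array([0])
>>> clf.predict_proba(iris.data[:1, :])
array([[1., 0., 0.]])
\end{verbatim}
\end{framed}
}
\end{frame}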
%========================================================================= %
\begin{frame}
\begin{figure}
\centering
\includegraphics[width=\linewidth]{SVMexplain}
\end{figure}
\end{frame}
\end{document}