Perceptron¶
graph LR
%% Inputs
X1([x₁])
X2([x₂])
B([1])
%% Neuron and output
SUM([∑])
F([f])
Y([yᵢ])
%% Labeled edges only
X1 -- w₁ --> SUM
X2 -- w₂ --> SUM
B -- b --> SUM
SUM --> F --> Y
Parts¶
- Summation: the ∑ node computes the weighted sum \(z = w_1x_1 + w_2x_2 + b\)
- Activation Function: the \(f\) node can apply any suitable function, such as a Step Function or a Sigmoid Function (a short sketch follows this list)
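A minimal Python sketch of this forward pass, assuming a step activation; the function names and example values are illustrative and not taken from the notes:

```python
# Sketch of the perceptron above: two inputs, weights, a bias,
# the summation z, and a step activation f. Values are illustrative.

def step(z: float) -> int:
    """Step activation: outputs 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def perceptron(x1: float, x2: float, w1: float, w2: float, b: float) -> int:
    z = w1 * x1 + w2 * x2 + b   # summation node (pre-activation)
    return step(z)              # activation node f

# Example usage with arbitrary numbers
print(perceptron(x1=0.5, x2=-1.0, w1=1.0, w2=2.0, b=0.5))  # z = -1.0 -> 0
```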
Formula¶
Info
The summation for a simple model can be expressed as: \(z = w_1x_1 + w_2x_2 + b\)
Activation Function: any suitable function can be applied to the summation \(z\), such as a Step Function or a Sigmoid Function (compared in the sketch below).
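A small comparison sketch of the two activation functions mentioned here; the sample \(z\) values are arbitrary assumptions:

```python
import math

# Two common activation choices for the perceptron: the hard step
# threshold and the smooth sigmoid. The sample z values are arbitrary.

def step(z: float) -> int:
    return 1 if z >= 0 else 0

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

for z in (-2.0, 0.0, 2.0):
    print(f"z={z:+.1f}  step={step(z)}  sigmoid={sigmoid(z):.3f}")
```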
Summation Examples
| IQ | CGPA | Placed |
|---|---|---|
| 78 | 55 | 1 |
| 45 | 34 | 0 |
Let's assume the following weights and bias:
- \(w_1 = 1\)
- \(w_2 = 2\)
- \(b = 6\)
- If the resulting value is \(\ge\) the chosen threshold, the candidate is placed
- If the resulting value is \(<\) the chosen threshold, the candidate is not placed (the summation itself is worked out below)
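Under the assumed weights \(w_1 = 1\), \(w_2 = 2\), \(b = 6\), the summation for the two table rows works out to 194 and 119. A short sketch (variable names are illustrative):

```python
# Worked summation for the two table rows with the assumed
# weights w1 = 1, w2 = 2 and bias b = 6. The notes do not fix the
# threshold that turns z into "placed"/"not placed", so only z is shown.

w1, w2, b = 1, 2, 6
rows = [(78, 55), (45, 34)]   # (IQ, CGPA) from the table

for iq, cgpa in rows:
    z = w1 * iq + w2 * cgpa + b
    print(f"IQ={iq}, CGPA={cgpa} -> z = {z}")

# z = 1*78 + 2*55 + 6 = 194
# z = 1*45 + 2*34 + 6 = 119
```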
Neuron Vs Perceptron¶
Interpretation¶
Geometric Intuition¶
Explanation
- Linear Combination / Pre-activation (\(Z\)):
  \[Z = w_1 x_1 + w_2 x_2 + b\]
- Step Activation Function (\(Y\)):
  \[Y = f(Z) = \begin{cases} 1 & Z \geq 0 \\ 0 & Z < 0 \end{cases}\]
- General Form of a Line (Substitution):
  Let \(w_1 = A\), \(w_2 = B\), \(b = C\), \(x_1 = x\), and \(x_2 = y\); then:
  \[\text{General form of a line: } Ax + By + C = 0\]
- Region Classification based on the Line (sketched in code below):
  \[Ax + By + C \begin{cases} \geq 0 & \text{+ve Region} \\ < 0 & \text{-ve Region} \end{cases}\]
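A small sketch of this region test; the line coefficients and sample points below are assumptions for illustration, not values from the notes:

```python
# Region test from the geometric view: the perceptron's weights define
# a line A*x + B*y + C = 0, and the sign of A*x + B*y + C decides which
# side of the line a point falls on.

A, B, C = 1.0, 2.0, -3.0           # assumed line: x + 2y - 3 = 0

def region(x: float, y: float) -> int:
    value = A * x + B * y + C
    return 1 if value >= 0 else 0  # 1 = +ve region, 0 = -ve region

for x, y in [(3.0, 1.0), (0.0, 0.0), (1.0, 1.0)]:
    print((x, y), "->", region(x, y))
```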
Limitation¶
- It works only when the data is linearly (or approximately linearly) separable
- It does not work when the classes require a non-linear decision boundary, e.g. XOR (illustrated below)
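As a rough illustration, scikit-learn's `Perceptron` (an external dependency, not part of these notes) separates the linearly separable AND truth table, but cannot reach full accuracy on XOR because no single line separates the XOR classes:

```python
# Illustrative only: a linear perceptron fits the AND data but cannot
# reach 100% accuracy on XOR, since XOR is not linearly separable.

from sklearn.linear_model import Perceptron

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_and = [0, 0, 0, 1]   # linearly separable
y_xor = [0, 1, 1, 0]   # not linearly separable

for name, y in [("AND", y_and), ("XOR", y_xor)]:
    clf = Perceptron(max_iter=1000, tol=None, random_state=0)
    clf.fit(X, y)
    print(name, "training accuracy:", clf.score(X, y))
```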