Matrices for solving systems by elimination

Solve a Linear System with the matrix row-echelon form

We can use the power of matrices: by rewriting a linear system in matrix form, we can solve it quickly.

$$
\left\{\begin{matrix}
x_1+2x_2+x_3+x_4=7\\
x_1+2x_2+2x_3-x_4=12\\
2x_1+4x_2+6x_4=4
\end{matrix}\right.
$$

This can be rewritten as an augmented matrix:

$$
A=\begin{bmatrix}
\begin{array}{cccc|c}
1&2&1&1&7\\
1&2&2&-1&12\\
2&4&0&6&4
\end{array}
\end{bmatrix}
$$

Row-reduce the matrix to its reduced row-echelon form (RREF):

  • The red entries are the leading 1s: each is the only nonzero entry in its column, and such an entry is also called a pivot value.

  • The blue entries correspond to the free variables.

$$
\begin{bmatrix}
\begin{array}{cccc|c}
\color{red}{1}&\color{blue}{2}&0&\color{blue}{3}&2\\
0&0&\color{red}{1}&\color{blue}{-2}&5\\
0&0&0&0&0
\end{array}
\end{bmatrix} = \text{rref}(A)
$$
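A quick way to double-check this row reduction (a minimal sketch using SymPy, not part of the original notes):

```python
from sympy import Matrix

# Augmented matrix [A | b] of the system above
A = Matrix([
    [1, 2, 1,  1,  7],
    [1, 2, 2, -1, 12],
    [2, 4, 0,  6,  4],
])

rref_A, pivot_cols = A.rref()
print(rref_A)      # Matrix([[1, 2, 0, 3, 2], [0, 0, 1, -2, 5], [0, 0, 0, 0, 0]])
print(pivot_cols)  # (0, 2) -> pivots in the x1 and x3 columns; x2 and x4 are free
```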

We can convert this result back into equations:

$$
\begin{aligned}
&\left\{\begin{matrix}
x_1+2x_2+3x_4=2\\
x_3-2x_4=5
\end{matrix}\right.\\[6pt]
\Rightarrow\;&
\left\{\begin{matrix}
x_1 = 2-2x_2-3x_4\\
x_3 = 5+2x_4
\end{matrix}\right.
\end{aligned}
$$

The solution can then be expressed in linear-combination form:

$$
\begin{bmatrix}x_1\\x_2\\x_3\\x_4\end{bmatrix} =
\begin{bmatrix}2\\0\\5\\0\end{bmatrix} +
x_2\begin{bmatrix}-2\\1\\0\\0\end{bmatrix} +
x_4\begin{bmatrix}-3\\0\\2\\1\end{bmatrix}
$$
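As a sanity check (a small NumPy sketch, not from the original notes), we can plug this parametric solution back into the original system for arbitrary values of the free variables $x_2$ and $x_4$:

```python
import numpy as np

# Coefficient matrix and right-hand side of the original system
A = np.array([[1, 2, 1,  1],
              [1, 2, 2, -1],
              [2, 4, 0,  6]], dtype=float)
b = np.array([7, 12, 4], dtype=float)

# Parametric solution: x = p + x2*v2 + x4*v4
p  = np.array([ 2, 0, 5, 0], dtype=float)
v2 = np.array([-2, 1, 0, 0], dtype=float)
v4 = np.array([-3, 0, 2, 1], dtype=float)

for x2, x4 in [(0, 0), (1, -2), (3.5, 7)]:  # arbitrary choices of the free variables
    x = p + x2 * v2 + x4 * v4
    assert np.allclose(A @ x, b)            # every choice solves the system
```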

Geometrically, the solution set looks like this:

[Figure: linear systems]

If your reduced row-echelon form ends up looking like this:

$$
\begin{bmatrix}
\begin{array}{cccc|c}
1&2&0&3&4\\
0&0&1&-2&4\\
\color{red}{0}&\color{red}{0}&\color{red}{0}&\color{red}{0}&\color{red}{-4}
\end{array}
\end{bmatrix}
$$

then the last row reads $0=-4$, which is impossible: the three equations (hyperplanes in $\mathbb{R}^4$) have no common intersection, so the system has no solution.

If every variable has a leading 1, each variable corresponds to exactly one value, which means the system has a unique solution:

$$
\begin{bmatrix}
\begin{array}{cccc|c}
1&x&x&x&a\\
0&1&x&x&b\\
0&0&1&x&c\\
0&0&0&1&d
\end{array}
\end{bmatrix}
$$

In the example above, the presence of free variables means there is no unique solution; the system has infinitely many solutions:

$$
\begin{bmatrix}
\begin{array}{cccc|c}
\color{red}{1}&\color{blue}{2}&0&\color{blue}{3}&2\\
0&0&\color{red}{1}&\color{blue}{-2}&5\\
0&0&0&0&0
\end{array}
\end{bmatrix}
$$
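These three cases can also be told apart numerically by comparing ranks (the standard Rouché–Capelli criterion; the sketch below uses NumPy and is not part of the original notes): no solution when $\text{rank}(A) < \text{rank}([A|b])$, a unique solution when both ranks equal the number of unknowns, and infinitely many solutions when the ranks agree but are smaller than the number of unknowns.

```python
import numpy as np

def classify(A, b):
    """Classify the linear system A x = b by comparing ranks."""
    aug = np.column_stack([A, b])                 # augmented matrix [A | b]
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(aug)
    n = A.shape[1]                                # number of unknowns
    if rank_A < rank_aug:
        return "no solution"
    return "unique solution" if rank_A == n else "infinitely many solutions"

A = np.array([[1, 2, 1,  1],
              [1, 2, 2, -1],
              [2, 4, 0,  6]], dtype=float)
b = np.array([7, 12, 4], dtype=float)
print(classify(A, b))   # infinitely many solutions (two free variables)
```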