Linear dependence and independence




With the ideas of linear combination and span in hand, the meaning of linear dependence / independence is easy to grasp.

When you span a space using exactly the vectors needed, no more and no fewer, those vectors are linearly independent.

For example, when (1, 0) and (0, 1) are used to span the entire R² plane, these two vectors are linearly independent.

If a redundant vector is included when spanning, that vector together with the already-sufficient ones forms a linearly dependent set.

For example, (1, 0) and (0, 1) already suffice to span R²,

but if I also add the vector (1, 3) to form the R² plane, then (1, 3) and the other two vectors are linearly dependent.
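The redundancy can be written out explicitly: the extra vector (1, 3) is itself a linear combination of the other two, which is exactly what makes the set dependent:

$$
\begin{bmatrix}1\\3\end{bmatrix} = 1\begin{bmatrix}1\\0\end{bmatrix} + 3\begin{bmatrix}0\\1\end{bmatrix}
$$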

  • A more formal definition:

    • if any vector in the set can be expressed as a linear combination of the others,

    • or if the linear combination can equal 0 while some scalar is nonzero,

    • then the vectors are linearly dependent.

$$
\begin{aligned}
\text{Linearly dependent} &\iff a_1\vec{v_1} + a_2\vec{v_2}+\cdots+a_n\vec{v_n} = 0 = \begin{bmatrix} 0\\\vdots\\0 \end{bmatrix} \text{ for some } a_i \text{, not all zero}\\
&\iff \vec{v_1} = -\frac{a_2}{a_1}\vec{v_2} - \frac{a_3}{a_1}\vec{v_3} - \cdots - \frac{a_n}{a_1}\vec{v_n} \quad (\text{assuming } a_1 \neq 0)
\end{aligned}
$$
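The definition also suggests a quick numerical check. A minimal sketch using NumPy (the helper name `is_linearly_independent` is my own): a set of vectors is independent exactly when the matrix whose columns are those vectors has full column rank.

```python
import numpy as np

def is_linearly_independent(vectors):
    """Vectors are independent iff the matrix whose columns are the
    vectors has rank equal to the number of vectors."""
    A = np.column_stack(vectors)  # each input vector becomes a column
    return np.linalg.matrix_rank(A) == len(vectors)

# (1, 0) and (0, 1) span R^2 with no redundancy: independent
print(is_linearly_independent([[1, 0], [0, 1]]))          # True

# Adding (1, 3) introduces a redundant vector: dependent
print(is_linearly_independent([[1, 0], [0, 1], [1, 3]]))  # False
```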

The definition above gives a practical test for whether a set of vectors is linearly dependent or independent.

For example, to test whether the following two vectors are linearly dependent:

$$
\begin{aligned}
\vec{v_1} &= (2,1)\\
\vec{v_2} &= (3,2) \\
a_1\vec{v_1}+a_2\vec{v_2} &= 0 = \begin{bmatrix} 0\\0 \end{bmatrix}\\
a_1 \begin{bmatrix}2\\1\end{bmatrix} + a_2 \begin{bmatrix}3\\2\end{bmatrix}&=\begin{bmatrix}0\\0\end{bmatrix}
\end{aligned}
$$

Expanding this componentwise and solving:

$$
\begin{cases}
2a_1 + 3a_2 = 0\\
a_1 + 2a_2 = 0
\end{cases}
\implies a_1 = 0,\quad a_2 = 0
$$

The only solution is the trivial one, so $\vec{v_1}$ and $\vec{v_2}$ are linearly independent!
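As a sanity check, the same homogeneous system can be handed to NumPy; this is only a numerical illustration of the algebra above, not part of the proof.

```python
import numpy as np

# Columns of A are v1 = (2, 1) and v2 = (3, 2)
A = np.array([[2.0, 3.0],
              [1.0, 2.0]])

# det(A) = 2*2 - 3*1 = 1 != 0, so the system has a unique solution
print(np.linalg.det(A))

# Solve a1*v1 + a2*v2 = 0: only the trivial solution exists
a = np.linalg.solve(A, np.zeros(2))
print(a)  # [0. 0.]
```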

3-dimensional example:

https://youtu.be/CrV1xCWdY-g
https://youtu.be/Alhcv5d_XOs
https://youtu.be/9kW6zFK5E5c