Subspaces and the basis for a subspace


$$
\begin{aligned}
\text{Subspace } \mathbf{V} = \operatorname{span}&\left(\vec{v_1}, \vec{v_2}, \cdots, \vec{v_n}\right) \\
&\begin{Bmatrix} \vec{v_1}, \vec{v_2}, \cdots, \vec{v_n} \end{Bmatrix} \text{ is linearly independent} \\
\text{then} \\
\mathbf{S} &= \begin{Bmatrix} \vec{v_1}, \vec{v_2}, \cdots, \vec{v_n} \end{Bmatrix} \\
\mathbf{S} &\text{ is a } \textbf{basis} \text{ for } \mathbf{V}
\end{aligned}
$$
  • If we use the minimum set of vectors to span a subspace V,

  • that is, when the vectors that span subspace V are linearly independent,

  • then this set of vectors is called a basis for the subspace (a quick numeric check follows below).
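
As a quick sanity check on this definition, here is a minimal NumPy sketch (the helper `is_basis_for_rn` is hypothetical, written just for this note): $k$ vectors form a basis for $\mathbb{R}^n$ exactly when $k = n$ and the matrix with those vectors as columns has full rank $n$.

```python
import numpy as np

def is_basis_for_rn(vectors):
    """Hypothetical helper: do these vectors form a basis for R^n?

    They must span R^n and be linearly independent, which together
    mean: exactly n vectors, and the matrix with them as columns
    has full rank n.
    """
    A = np.column_stack(vectors)   # vectors become the columns of A
    n, k = A.shape                 # n = dimension, k = number of vectors
    return k == n and np.linalg.matrix_rank(A) == n

# The set T from the example below is a basis for R^2:
print(is_basis_for_rn([np.array([1, 0]), np.array([0, 1])]))  # True
# A dependent set is not:
print(is_basis_for_rn([np.array([1, 0]), np.array([2, 0])]))  # False
```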

Let's look at an example, T:

$$
\mathbf{T} = \begin{Bmatrix} \begin{bmatrix}1\\0\end{bmatrix}, \begin{bmatrix}0\\1\end{bmatrix} \end{Bmatrix}
$$

First, it spans the subspace $\mathbb{R}^2$:

$$
\begin{aligned}
c_1\begin{bmatrix} 1\\0 \end{bmatrix} + c_2\begin{bmatrix} 0\\1 \end{bmatrix} &= \begin{bmatrix} x_1\\x_2 \end{bmatrix} \\
c_1 + 0 = x_1,\quad c_1 &= x_1 \\
0 + c_2 = x_2,\quad c_2 &= x_2
\end{aligned}
$$

and it is linearly independent:

$$
\begin{aligned}
c_1\begin{bmatrix} 1\\0 \end{bmatrix} + c_2\begin{bmatrix} 0\\1 \end{bmatrix} &= \begin{bmatrix} 0\\0 \end{bmatrix} \\
c_1 + 0 = 0,\quad c_1 &= 0 \\
0 + c_2 = 0,\quad c_2 &= 0
\end{aligned}
$$

So T is a basis for $\mathbb{R}^2$ (and in fact it is the standard basis).
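
Both checks on T can also be run numerically. A minimal sketch, where the target vector `x` is an arbitrary choice for illustration:

```python
import numpy as np

# Columns of T are the candidate basis vectors.
T = np.array([[1, 0],
              [0, 1]])

# Span: any x in R^2 is reachable, since T @ c = x is solvable.
x = np.array([3.0, -2.0])               # arbitrary target (assumption)
c = np.linalg.solve(T, x)
print(c)                                # [ 3. -2.]  ->  c1 = x1, c2 = x2

# Independence: T @ c = 0 has only the trivial solution c = 0.
print(np.linalg.solve(T, np.zeros(2)))  # [0. 0.]
```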

Moreover, any vector generated by these basis vectors has a unique representation within the subspace:

Because the difference of two representations still lies in the subspace, and the set satisfies the basis condition (linear independence), the following argument proves that the representation of any generated vector is unique:

$$
\begin{aligned}
\vec{a} &\in \mathbf{V}, \\
\vec{a} &= c_1\vec{v_1} + c_2\vec{v_2} + \cdots + c_n\vec{v_n} \\
\vec{a} &= d_1\vec{v_1} + d_2\vec{v_2} + \cdots + d_n\vec{v_n} \quad \text{(subtract)} \\
\vec{0} &= (c_1-d_1)\vec{v_1} + (c_2-d_2)\vec{v_2} + \cdots + (c_n-d_n)\vec{v_n}
\end{aligned}
$$
By linear independence, every coefficient of this zero combination must itself be zero:

$$
\begin{aligned}
c_1 - d_1 &= 0 \\
c_2 - d_2 &= 0 \\
&\;\vdots \\
c_n - d_n &= 0 \\
\implies c_i &= d_i \quad \text{for every } i
\end{aligned}
$$
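
To see the same uniqueness argument numerically, here is a small sketch with a made-up non-standard basis `B` and vector `a` (both hypothetical): because the columns of `B` are independent, solving for the coordinates yields exactly one answer.

```python
import numpy as np

# A non-standard basis of R^2 and a target vector (hypothetical example).
B = np.column_stack([np.array([1, 1]), np.array([-1, 2])])
a = np.array([4.0, 7.0])

# Coordinates of a with respect to B: solve B @ c = a.
c = np.linalg.solve(B, a)
print(c)                      # [5. 1.] -- the unique coordinate vector
print(np.allclose(B @ c, a))  # True: reconstruction matches

# Uniqueness: if B @ c == B @ d, then B @ (c - d) == 0; independent
# columns force c - d == 0, i.e. c == d.
```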