Linear combinations and spans

Last updated 5 years ago

A linear combination is what you get when vectors are each scaled by an arbitrary real scalar and then added together:

$$a_1\vec{v_1} + a_2\vec{v_2} + \cdots + a_n\vec{v_n}, \quad a_i \in \mathbb{R}$$

For example, the following two vectors, once multiplied by scalars, can be combined to represent a new vector:

$$3 \begin{bmatrix} 1\\ 2 \end{bmatrix} + 2 \begin{bmatrix} 0\\ 3 \end{bmatrix} = \begin{bmatrix} 3\\ 12 \end{bmatrix}$$
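As a quick sanity check, the arithmetic above can be reproduced with NumPy (covered later in these notes); a minimal sketch using the values from the example:

```python
import numpy as np

# Linear combination: scale each vector by its scalar, then add.
v1 = np.array([1, 2])
v2 = np.array([0, 3])
combo = 3 * v1 + 2 * v2
print(combo)  # [ 3 12]
```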

Because vectors can be scaled by any real number, the set of all vectors that their linear combinations can reach is called the span.

For example, no matter what values $a_1$ and $a_2$ take, combinations of the following two vectors can only span the line that the vectors themselves lie on:

$$\begin{aligned} a_1 \begin{bmatrix} 1\\ 2 \end{bmatrix} + a_2 \begin{bmatrix} 2\\ 4 \end{bmatrix} &= a_1 \begin{bmatrix} 1\\ 2 \end{bmatrix} + 2a_2 \begin{bmatrix} 1\\ 2 \end{bmatrix}\\ &= (a_1 + 2a_2) \begin{bmatrix} 1\\ 2 \end{bmatrix} = a \begin{bmatrix} 1\\ 2 \end{bmatrix} \end{aligned}$$
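One way to see this collinearity numerically is to stack the two vectors as columns of a matrix and check its rank; a rank of 1 (rather than 2) confirms the span is just a line. A small sketch with NumPy:

```python
import numpy as np

# (1,2) and (2,4) are collinear: (2,4) = 2 * (1,2).
# With these vectors as columns, the matrix has rank 1, not 2,
# so their combinations cannot cover the whole plane.
A = np.array([[1, 2],
              [2, 4]])
print(np.linalg.matrix_rank(A))  # 1
```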

In contrast, the two vectors (1,0) and (0,1) below can span every point of the 2D plane, which we can write as:

$$\begin{aligned} &\vec{v_1} = (1,0), \quad \vec{v_2} = (0,1)\\ &\{\, a_1\vec{v_1} + a_2\vec{v_2} \mid a_1, a_2 \in \mathbb{R} \,\} = \mathbb{R}^2 \end{aligned}$$
https://youtu.be/Qm_OS-8COwU