
Training Neural Networks

Today we’re going to talk about how neurons in a neural network learn by having their weights adjusted, a process called backpropagation, and how we can optimize networks by finding the combination of weights that minimizes error. Then we’ll send John Green Bot into the metaphorical jungle to find where this error is smallest, known as the global optimal solution, as opposed to spots where it’s only relatively small, called local optimal solutions. We’ll also discuss some strategies we can use to help neural networks find these optimized solutions more quickly.
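
To make the two ideas in that description concrete, here is a minimal sketch (not from the episode) of backpropagation and gradient descent: a tiny one-hidden-layer network learns XOR by repeatedly nudging its weights downhill on the error surface. The layer sizes, learning rate, random seed, and step count are illustrative choices, not values from the video.

# Backpropagation + gradient descent sketch; assumptions noted in comments.
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random starting weights: where on the "jungle floor" we begin the search.
W1 = rng.normal(size=(2, 4))   # input -> hidden
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output
b2 = np.zeros((1, 1))

learning_rate = 1.0            # illustrative step size

for step in range(10000):
    # Forward pass: compute the network's current guesses.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Mean squared error: the "height" of the terrain we want to minimize.
    error = output - y
    loss = np.mean(error ** 2)

    # Backpropagation: the chain rule gives the gradient of the loss
    # with respect to every weight and bias.
    d_output = 2 * error * output * (1 - output) / len(X)
    d_W2 = hidden.T @ d_output
    d_b2 = d_output.sum(axis=0, keepdims=True)

    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)
    d_W1 = X.T @ d_hidden
    d_b1 = d_hidden.sum(axis=0, keepdims=True)

    # Gradient descent: take a small step downhill.
    W1 -= learning_rate * d_W1
    b1 -= learning_rate * d_b1
    W2 -= learning_rate * d_W2
    b2 -= learning_rate * d_b2

    if step % 2000 == 0:
        print(f"step {step:5d}  loss {loss:.4f}")

# Depending on the random start, the network may settle into a local
# optimum instead of the global one and leave some predictions near 0.5.
print("final predictions:", output.round(3).ravel())

Rerunning the sketch with a different seed or learning rate changes which valley of the error surface the weights end up in, which is exactly the local-versus-global-optimum problem the episode describes.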

  • Language English
  • Originally Aired August 30, 2019
  • Runtime 10 minutes
  • Production Code lgKrup5oi_A
  • Network YouTube