
Adam Optimization Algorithm (C2W2L08)



DeepLearningAI

Take the Deep Learning Specialization: http://bit.ly/2vBG4xl
Check out all our courses: https://www.deeplearning.ai
Subscribe to The Batch, our weekly newsletter: https://www.deeplearning.ai/thebatch

Follow us:
Twitter: https://twitter.com/deeplearningai_
Facebook: https://www.facebook.com/deeplearningHQ/
LinkedIn: https://www.linkedin.com/company/deeplearningai



24 thoughts on “Adam Optimization Algorithm (C2W2L08)”
  1. This video is closely related to the video "Bias Correction of Exponentially Weighted Averages". Please revisit that video if you feel this is too confusing.

  2. Clarification about Adam Optimization

    Please note that at 2:44, the Sdb equation is correct. However, from 2:48 onward, the db² term is missing its square.

    The bottom-right equation should still be:

    Sdb = β₂Sdb + (1 – β₂)db²

    (A worked sketch of the full Adam update, including bias correction, appears after this comment thread.)

  3. Could anyone give me a list of the notations he mentions in the video, or direct me towards a video that explains them? My main difficulty in understanding the concept in this video is the lack of explanation of the notation used.

  4. The very best and most succinct explanation of ADAM I've ever seen. Things become crystal clear if one watches L06 to L08 in a row.

  5. Hey there, I know I'm late to the party, but I have a pressing question the rest of the internet has failed to answer so far.
    I currently have to work with a model and network I didn't design, and my job is basically to find out what's wrong, so naturally I need to understand the lines of code used.
    There is one line I haven't found any example for: optimizer = keras.optimizers.Adam(0.002, 0.5)
    I'm still studying, so I'm not that well versed in Keras or anything AI-related yet, but I want to know whether this second value refers to beta_1 or some other value I'm not noticing.
    The documentation has me puzzled so far, so I hope there's someone here who can answer this. (See the note on the Keras signature after this thread.)
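
For readers asking about the update rules in comments 2 and 3: below is a minimal NumPy sketch of a single Adam step written in the lecture's notation (first moment VdW/Vdb, second moment SdW/Sdb, bias correction by iteration count t). The default hyperparameters match the values suggested in the course (β₁ = 0.9, β₂ = 0.999, ε = 10⁻⁸); the toy objective in the usage lines is a hypothetical stand-in, not anything from the video.

import numpy as np

def adam_step(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameter w with gradient dw.

    v and s are the running first and second moments (VdW and SdW in the
    lecture); t is the 1-indexed iteration count used for bias correction.
    """
    v = beta1 * v + (1 - beta1) * dw        # momentum-style first moment
    s = beta2 * s + (1 - beta2) * dw**2     # RMSprop-style second moment (note the square)
    v_hat = v / (1 - beta1**t)              # bias correction of the first moment
    s_hat = s / (1 - beta2**t)              # bias correction of the second moment
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# Hypothetical usage: minimize f(w) = w^2, whose gradient is 2w.
w, v, s = 5.0, 0.0, 0.0
for t in range(1, 1001):
    w, v, s = adam_step(w, 2 * w, v, s, t, lr=0.1)
print(w)  # approaches 0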
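
Regarding the Keras question in comment 5: in the Keras API, Adam's first positional parameter is the learning rate and the second is beta_1, so Adam(0.002, 0.5) sets learning_rate=0.002 and beta_1=0.5 (a low beta_1 such as 0.5 is often seen in GAN training recipes). A minimal, more explicit sketch of the equivalent call, assuming a TensorFlow-backed Keras:

from tensorflow import keras

# Equivalent to keras.optimizers.Adam(0.002, 0.5): the first positional
# argument is the learning rate, the second is beta_1. beta_2 and epsilon
# keep their Keras defaults (0.999 and 1e-7).
optimizer = keras.optimizers.Adam(learning_rate=0.002, beta_1=0.5)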

