
    finallyifoundvalidUN

    75,132 link karma
    1,031 comment karma

    [–] [D] One week in San Francisco - where's the best places to go for machine/deep learning communities? finallyifoundvalidUN 1 points ago in MachineLearning

    Is there any way to get access to the class materials online? I'm not able to attend the discussions, since I live in another country.

    [–] [image] aww finallyifoundvalidUN 6 points ago in GetMotivated

    All credit goes to Brad Montague

    More: http://facebook.com/montagueworkshop

    [–] Is it worth the time for someone who is well versed with Deep Learning to take the Andrew Ng course? finallyifoundvalidUN 5 points ago in MachineLearning

    The course consists of 5 sub-courses, and you can watch the videos in audit mode for free; you can check the assignments and solve them, but you can't submit them. If you already know deep learning and have done a few projects, it won't help that much (IMO). I've already taken a few courses myself, so these materials aren't new to me at all.

    The bottom line: everything from Andrew Ng is worth watching.

    [–] [D] How can CEC prevent LSTM from the problem of vanishing gradients? finallyifoundvalidUN 1 points ago in MachineLearning

    Well, it's not homework. I'm a high-school student, and right now I'm trying to teach myself how it works.

    Thanks anyway

    [–] [D] How can CEC prevent LSTM from the problem of vanishing gradients? finallyifoundvalidUN 1 points ago in MachineLearning

    Right, but what I don't understand is how the gradient does not vanish due to the other activation functions.

    The input, output, and forget gates use a sigmoid, whose derivative is at most 0.25. I'm wondering: how does backpropagating through those not make the gradient vanish?
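    The question above can be made concrete with a small numeric sketch (my own illustration, not from the thread), assuming the standard LSTM cell update c_t = f_t * c_{t-1} + i_t * g_t. The key point: the gradient along the cell-state path is scaled by the forget-gate *value* f_t, not by a sigmoid *derivative*, so it can stay near 1 over many time steps (the original CEC fixed it at exactly 1).

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def dsigmoid(x):
        s = sigmoid(x)
        return s * (1.0 - s)

    # The sigmoid's derivative peaks at x = 0 with value 0.25, so a chain of
    # T such derivatives shrinks the gradient by up to 0.25**T.
    xs = np.linspace(-10.0, 10.0, 10001)
    print(dsigmoid(xs).max())  # 0.25

    # Along the cell state, dc_t/dc_{t-1} = f_t (plus gate terms), a gate
    # VALUE in (0, 1) rather than a squashing-function derivative. If the
    # forget gate sits close to 1, the product over T steps need not vanish.
    T = 100
    f = 0.99                 # assumed forget-gate value, close to 1
    print(f ** T)            # survives: about 0.366 after 100 steps
    print(0.25 ** T)         # vanishes: chain of sigmoid derivatives
    ```

    The gates themselves are still trained through sigmoid derivatives, but they act multiplicatively on the error flowing along the cell state; that state-to-state path has no activation-derivative factor, which is what the constant error carousel (CEC) was designed to guarantee.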