
    [–] Creative Adversarial Networks, Simplified finallyifoundvalidUN 1 point ago in deeplearning

    Thanks, done!

    It's not just a raw translation, I added more information.

    There you go:

    [–] Creative Adversarial Networks, Simplified finallyifoundvalidUN 1 point ago in deeplearning

    Am I allowed to translate your article into my native language?

    [–] [D] GTX 1050ti vs GTX 1060 for Machine Learning Workstation finallyifoundvalidUN 1 point ago in MachineLearning

    Forget about both of them! I first bought a GTX 1050 Ti, but it ran out of memory. Go straight for a GTX 1080 Ti.

    [–] [D] How do people come up with all these crazy deep learning architectures? finallyifoundvalidUN 8 points ago in MachineLearning

    I start with a baseline and measure how well it works, then keep adding different layers and babysit the training process.
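    That loop can be sketched in a few lines. This is a toy illustration, not the commenter's actual workflow: polynomial degree stands in for network depth, and the data is a made-up noisy cubic, so only the shape of the process (baseline first, grow capacity while validation improves) carries over.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: a noisy cubic, split into train / validation.
x = rng.uniform(-1, 1, 200)
y = x**3 - x + 0.1 * rng.standard_normal(200)
x_tr, y_tr, x_va, y_va = x[:150], y[:150], x[150:], y[150:]

def val_error(degree):
    """Fit a polynomial on the training split, return validation MSE."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    pred = np.polyval(coeffs, x_va)
    return float(np.mean((pred - y_va) ** 2))

# Baseline first, then add capacity only while validation improves.
best_deg, best_err = 1, val_error(1)
for degree in range(2, 10):
    err = val_error(degree)
    if err < best_err:
        best_deg, best_err = degree, err
print(best_deg, round(best_err, 4))
```

    The point of the loop is that the baseline score is the reference: a change is kept only if the held-out metric improves, which is the "measure, add, babysit" cycle in miniature.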

    [–] Understanding and implementing CNNs from scratch finallyifoundvalidUN 1 points ago in learnmachinelearning

    I was actually about to implement something similar for my own understanding. I like the way you put it into code. Am I allowed to translate the whole thing into my native language and share it with people?

    [–] [R] Self-Normalizing Neural Networks finallyifoundvalidUN 2 points ago in MachineLearning


    They introduce self-normalizing neural networks (SNNs), for which they prove that neuron activations are pushed towards zero mean and unit variance when propagated through the network.

    Additionally, for activations not close to unit variance, they prove an upper and lower bound on the variance mapping, so SNNs do not face vanishing or exploding gradient problems. As a result, SNNs work well for architectures with many layers, permit a novel regularization scheme, and learn very robustly. On 121 UCI benchmark datasets, SNNs outperformed other FNNs both with and without normalization techniques such as batch, layer, and weight normalization, as well as specialized architectures such as Highway or Residual networks. SNNs also yielded the best results on drug discovery and astronomy tasks. The best-performing SNN architectures are typically very deep, in contrast to other FNNs.

    Code:

    [–] [R] 'Hashing' can eliminate more than 95 percent of computations finallyifoundvalidUN 9 points ago in MachineLearning

    TL;DR: Computer scientists have adapted a widely used technique for rapid data-lookup to slash the amount of computation -- and thus energy and time -- required for 'deep learning.'

    "This applies to any deep-learning architecture, and the technique scales sublinearly, which means that the larger the deep neural network to which this is applied, the more the savings in computations there will be," said lead researcher Anshumali Shrivastava, an assistant professor of computer science at Rice.

    The research will be presented in August at the KDD 2017 conference in Halifax, Nova Scotia. It addresses one of the biggest issues facing tech giants like Google, Facebook and Microsoft as they race to build, train and deploy massive deep-learning networks for a growing body of products as diverse as self-driving cars, language translators and intelligent replies to emails.
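    The core idea, hashing to decide which neurons to evaluate at all, can be sketched with a toy sign-random-projection (SimHash) filter. This is an illustrative assumption about how such a scheme works, not the authors' implementation; the layer sizes and names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def simhash_signature(v, planes):
    """Sign pattern of v against random hyperplanes -> integer bucket id."""
    bits = (planes @ v) > 0
    return int("".join("1" if b else "0" for b in bits), 2)

# Hypothetical fully connected layer: 512 neurons over a 64-dim input.
W = rng.standard_normal((512, 64))
planes = rng.standard_normal((6, 64))  # 6 hyperplanes -> 64 buckets

# Preprocessing: hash every neuron's weight vector into a bucket table.
table = {}
for i, w in enumerate(W):
    table.setdefault(simhash_signature(w, planes), []).append(i)

# Inference: hash the input and evaluate ONLY the neurons sharing its
# bucket, i.e. those whose weights roughly align with the input.
x = rng.standard_normal(64)
active = table.get(simhash_signature(x, planes), [])
outputs = {i: float(W[i] @ x) for i in active}
print(len(active), "of", W.shape[0], "neurons evaluated")
```

    The sublinear scaling claim maps onto this sketch: the bucket lookup cost is fixed, while the fraction of neurons landing in any one bucket shrinks as the layer grows.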

    [–] Shooting incident in Iran parliament, news agencies report finallyifoundvalidUN 11 points ago in worldnews

    Man, you've been reading Western propaganda for way too long.

    You seriously need to rely more on facts and logical thinking than on emotion spurred by Western propaganda.