

Contrary to the paper's claims, Lee said that random shuffling does affect the results, and potentially by a lot, when tested on fully connected networks as opposed to convolutional neural networks.

Lee also questioned the validity of comparing early-pruning methods to post-training magnitude pruning. “Magnitude-based pruning undergoes training before it starts the pruning process, whereas pruning-at-initialization methods do not (by definition),” Lee said. “This indicates that they are not starting from the same start line (the former is far ahead of the others), and therefore, this could intrinsically and unfairly favor the former. In fact, the saliency of magnitude is not likely an active force that yields good performance for magnitude-based pruning; it’s rather the algorithm (e.g., how long it trains first, how much it prunes, etc.) that is well-tuned.”

Lee added that if magnitude-based pruning starts at the same stage as pruning-at-initialization methods, it will be equivalent to random pruning, because the initial weights of neural networks are random values.
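This point is easy to demonstrate numerically. The following is a minimal NumPy sketch (not from the article, and the `magnitude_prune` helper is hypothetical): when two layers are independently initialized with random weights and each is magnitude-pruned to 90 percent sparsity, the overlap between the surviving positions is about 10 percent, exactly what random selection would give, showing that weight magnitudes at initialization carry no useful pruning signal.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Keep the (1 - sparsity) fraction of weights with the largest magnitude."""
    k = int(w.size * sparsity)                     # number of weights to drop
    threshold = np.sort(np.abs(w).ravel())[k]      # k-th smallest magnitude
    return (np.abs(w) >= threshold).astype(float)  # 1 = keep, 0 = prune

rng = np.random.default_rng(0)

# Two independently initialized layers: at this point their magnitudes
# carry no information, so which positions survive pruning is arbitrary.
mask_a = magnitude_prune(rng.standard_normal((100, 100)), sparsity=0.9)
mask_b = magnitude_prune(rng.standard_normal((100, 100)), sparsity=0.9)

# Fraction of mask_a's surviving weights that also survive in mask_b.
# For two independent random selections at a 10% keep rate, this is ~0.10.
overlap = (mask_a * mask_b).sum() / mask_a.sum()
print(f"keep rate: {mask_a.mean():.2f}, overlap between runs: {overlap:.2f}")
```

After training, by contrast, the large-magnitude positions reflect what the network has learned, and two training runs from the same initialization would agree far more than chance.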

Making deep learning research more accessible

It will be interesting to see how research in this area unfolds. I’m also curious to see how these and future methods would perform on other neural network architectures such as Transformers, which are far more computationally expensive to train than CNNs. Also worth noting is that these methods have been developed for and tested on supervised learning problems. Hopefully, we’ll see similar research on techniques for more costly branches of AI such as deep reinforcement learning.

Progress in this field could have a huge impact on the future of AI research and applications. With the costs of training deep neural networks constantly growing, some areas of research are becoming more centralized in wealthy tech companies that have vast financial and computational resources.

Effective ways to prune neural networks before training them could create new opportunities for a wider group of AI researchers and labs who don’t have access to very large computational resources.

This article was originally published by Ben Dickson on TechTalks, a publication that examines trends in technology, how they affect the way we live and do business, and the problems they solve. But we also discuss the evil side of technology, the darker implications of new tech, and what we need to look out for. You can read the original article here.

Published October 18, 2020 at 09:00 UTC
