Quanta Magazine: Boris Knyazev of the University of Guelph in Ontario and his colleagues have designed and trained a “hypernetwork” — a kind of overlord of other neural networks — that could speed up the training process. Given a new, untrained deep neural network designed for some task, the hypernetwork predicts the parameters for the new network in fractions of a second, and in theory could make training unnecessary. Because the hypernetwork learns the extremely complex patterns in the designs of deep neural networks, the work may also have deeper theoretical implications.
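To make the parameter-prediction idea concrete, here is a minimal, hypothetical sketch in PyTorch: a small hypernetwork maps an encoding of a target architecture to one flat vector of values, which is then copied into the target network's weights, so the target can run immediately without gradient-based training. Knyazev and colleagues' actual system (GHN-2) runs a graph neural network over the target's computation graph; the dense encoder, the `HyperNetwork` class, and the random `arch_encoding` below are illustrative stand-ins, not the published method.

```python
import torch
import torch.nn as nn

# Toy target network whose parameters we want to *predict* rather than train.
target = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Total number of scalar parameters the hypernetwork must emit.
n_params = sum(p.numel() for p in target.parameters())

# A minimal, hypothetical hypernetwork: it consumes a fixed-size encoding of
# the target architecture and emits one flat vector of predicted parameters.
# (The real GHN-2 uses a graph neural network over the computation graph.)
class HyperNetwork(nn.Module):
    def __init__(self, encoding_dim: int, n_params: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(encoding_dim, 64), nn.ReLU(), nn.Linear(64, n_params)
        )

    def forward(self, arch_encoding: torch.Tensor) -> torch.Tensor:
        return self.body(arch_encoding)

hyper = HyperNetwork(encoding_dim=16, n_params=n_params)

# Crude stand-in for an architecture encoding; in practice this would be
# derived from the structure of the target network itself.
arch_encoding = torch.randn(16)
flat = hyper(arch_encoding)

# Copy the predicted values into the target network's parameter tensors.
offset = 0
with torch.no_grad():
    for p in target.parameters():
        p.copy_(flat[offset:offset + p.numel()].view_as(p))
        offset += p.numel()

# The target now produces outputs in a single forward pass, with no
# training loop of its own.
print(target(torch.randn(1, 4)))
```

In the article's framing, the hypernetwork itself is trained once, over many architectures, so that a single forward pass like the one above replaces the expensive per-network training loop.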