
Chapter

Title

A case where a spindly two-layer linear network decisively outperforms any neural network with a fully connected input layer

Authors

[ 1 ] Institute of Computing Science, Faculty of Computing and Telecommunications, Poznan University of Technology (Politechnika Poznańska) | [ P ] employee

Scientific discipline (Law 2.0)

[2.3] Information and communication technology

Year of publication

2021

Chapter type

chapter in monograph / paper

Publication language

English

Keywords
EN
  • neural networks
  • scale invariance
  • spindly network
  • gradient descent
  • lower bounds
Pages (from - to)

1214 - 1236

URL

http://proceedings.mlr.press/v132/warmuth21a.html

Book

Proceedings of the 32nd International Conference on Algorithmic Learning Theory

Presented on

32nd International Conference on Algorithmic Learning Theory (ALT 2021), 16-19.03.2021

Open Access Mode

publisher's website

Open Access Text Version

final published version

Date of Open Access to the publication

at the time of publication

Ministry points / chapter

5

Ministry points / conference (CORE)

70
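
BibTeX entry

A BibTeX entry assembled from the fields of this record might look as follows. The citation key is taken from the URL above; the author field is left empty because the record lists only the affiliation, not the author names:

```bibtex
@inproceedings{warmuth21a,
  author    = {},
  title     = {A case where a spindly two-layer linear network decisively outperforms any neural network with a fully connected input layer},
  booktitle = {Proceedings of the 32nd International Conference on Algorithmic Learning Theory},
  year      = {2021},
  pages     = {1214--1236},
  url       = {http://proceedings.mlr.press/v132/warmuth21a.html},
}
```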
