Publicacions CVC
Author
...
Title
...
Type
...
Year
...
Publication
...
Abbreviated Journal
...
Volume
...
Issue
...
Pages
...
Keywords
...
Abstract
Neural networks are very effective when trained on large datasets for a large number of iterations. However, when they are trained on non-stationary streams of data in an online fashion, their performance is reduced (1) by the online setup, which limits the availability of data, and (2) by catastrophic forgetting, caused by the non-stationary nature of the data. Furthermore, several recent works (Caccia et al., 2022; Lange et al., 2023, arXiv:2205.13452) showed that replay methods used in continual learning suffer from the stability gap, encountered when evaluating the model continually (rather than only at task boundaries). In this article, we study the effect of model ensembling as a way to improve performance and stability in online continual learning. We notice that naively ensembling models coming from a variety of training tasks considerably increases performance in online continual learning. Starting from this observation, and drawing inspiration from ensembling methods used in semi-supervised learning, we use a lightweight temporal ensemble that computes the exponential moving average (EMA) of the weights at test time, and show that it can drastically increase performance and stability when used in combination with several methods from the literature.
(A minimal code sketch of the test-time weight EMA appears after the record fields below.)
Address
...
Corporate Author
...
Thesis
...
Publisher
...
Place of Publication
...
Editor
...
Language
...
Summary Language
...
Original Title
...
Series Editor
...
Series Title
...
Abbreviated Series Title
...
Series Volume
...
Series Issue
...
Edition
...
ISSN
...
ISBN
...
Medium
...
Area
...
Expedition
...
Conference
...
Notes
...
Approved
...
Location
Call Number
...
Serial
Marked
...
Copy
...
Selected
...
User Keys
...
User Notes
...
User File
...
User Groups
...
Cite Key
...
Related
...
File
URL
...
DOI
...
Online publication. Cite with this text:
...
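
The abstract above describes a test-time temporal ensemble: an exponential moving average (EMA) of the online model's weights, updated after each training step and used only for evaluation. The sketch below illustrates that idea as a generic PyTorch-style example, not the authors' implementation; the function name update_ema, the 0.99 decay, and the stream/criterion/optimizer variables are assumptions introduced here for illustration.

import copy
import torch

@torch.no_grad()
def update_ema(ema_model, model, decay=0.99):
    # Weight-space EMA: ema_param <- decay * ema_param + (1 - decay) * param.
    for ema_p, p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(decay).add_(p, alpha=1.0 - decay)
    # Buffers (e.g. BatchNorm running statistics) are copied over directly.
    for ema_b, b in zip(ema_model.buffers(), model.buffers()):
        ema_b.copy_(b)

# Usage sketch (hypothetical online continual-learning loop): the base model
# is trained as usual on the non-stationary stream, possibly with a replay
# method, while the EMA copy is only updated and then used for evaluation.
#
#   model = ...                        # continually trained network
#   ema_model = copy.deepcopy(model)   # lightweight temporal ensemble
#   for x, y in stream:                # online, non-stationary data
#       loss = criterion(model(x), y)
#       loss.backward()
#       optimizer.step()
#       optimizer.zero_grad()
#       update_ema(ema_model, model, decay=0.99)
#
#   # continual evaluation is reported with ema_model, not model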