vasilii-feofanov and GPaolo committed
Commit a478593 (1 parent: d7e55f8)

Update README.md (#1)


- Update README.md (9ac1dad27eeeae87987c6605d4ea304eaf2c77f6)


Co-authored-by: Giuseppe Paolo <GPaolo@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +5 -0
README.md CHANGED
@@ -11,7 +11,12 @@ pinned: false
 
 ## Projects
 
+### Preprints
+
 - [Large Language Models as Markov Chains](https://huggingface.co/papers/2410.02724): theoretical insights on their generalization and convergence properties.
+
+### 2024
+
 - *(NeurIPS'24)* [MANO: Unsupervised Accuracy Estimation Under Distribution Shifts](https://huggingface.co/papers/2405.18979): when logits are enough to estimate generalization of a pre-trained model.
 - *(NeurIPS'24, **Spotlight**)* [Analysing Multi-Task Regression via Random Matrix Theory](https://arxiv.org/pdf/2406.10327): insights on a classical approach and its potentiality for time series forecasting.
 - *(ICML'24, **Oral**)* [SAMformer: Unlocking the Potential of Transformers in Time Series Forecasting](https://huggingface.co/papers/2402.10198): sharpness-aware minimization and channel-wise attention is all you need.