### 2023
| Title | Venue | Link |
| --- | --- | --- |
| FedPETuning: When Federated Learning Meets the Parameter-Efficient Tuning Methods of Pre-trained Language Models | ACL | [pdf](https://aclanthology.org/2023.findings-acl.632/), [code](https://github.com/SMILELab-FL/FedPETuning) |

### 2022
| Title | Venue | Link |
| --- | --- | --- |
| Scaling Language Model Size in Cross-Device Federated Learning | ACL Workshop | [pdf](https://arxiv.org/abs/2204.09715) |

### 2021
| Title | Venue | Link |
| --- | --- | --- |
| Scaling federated learning for fine-tuning of large language models | NLDB | [pdf](https://arxiv.org/abs/2102.00875) |