
FLoRG: Federated Fine-tuning with Low-rank Gram Matrices and Procrustes Alignment

Source: arXiv

Chuiyang Meng, Ming Tang, Vincent W. S. Wong

cs.LG | cs.AI | Feb 19, 2026

One-line Summary

FLoRG improves federated fine-tuning of large language models by having each client transmit a single low-rank matrix and by aligning updates with Procrustes analysis, improving accuracy while reducing communication overhead.

Plain-language Overview

FLoRG is a new method for fine-tuning large language models (LLMs) across different devices without sharing sensitive data. It simplifies the process by using one low-rank matrix instead of two, which avoids errors introduced during aggregation and reduces the amount of data that must be communicated. It also uses Procrustes alignment to keep client updates consistent, leading to better performance. Experiments show that FLoRG is more accurate and communication-efficient than existing methods.
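The paper's exact alignment procedure is not detailed in this overview, but the name refers to the classical orthogonal Procrustes problem: given two matrices `A` and `B`, find the orthogonal matrix `R` minimizing `||A @ R - B||_F`, solved in closed form via the SVD of `A.T @ B`. The sketch below illustrates that generic building block in NumPy; the function name and shapes are illustrative, not from the paper.

```python
import numpy as np

def procrustes_align(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Return the orthogonal R minimizing ||A @ R - B||_F
    (the orthogonal Procrustes problem, solved via SVD)."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

# Toy check: if B is an exact orthogonal rotation of A,
# Procrustes alignment recovers that rotation.
rng = np.random.default_rng(0)
A = rng.standard_normal((16, 4))          # e.g. a low-rank factor
R_true, _ = np.linalg.qr(rng.standard_normal((4, 4)))
B = A @ R_true                            # rotated copy of A
R = procrustes_align(A, B)
print(np.allclose(A @ R, B))              # alignment reproduces B
```

In a federated setting, this kind of alignment lets the server rotate clients' low-rank factors into a common basis before averaging, so that updates expressed in different local bases are not averaged incoherently.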

Technical Details