"If a worker wants to do his job well, he must first sharpen his tools." - Confucius, "The Analects of Confucius. Lu Linggong"

Huge Daily Developments for FLUX LoRA Training (Now Even Works on 8GB GPU) and More

Published on 2024-11-04

Joycaption now has both multi-GPU support and batch-size support > https://www.patreon.com/posts/110613301
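The combination of multi-GPU and batch-size support means the captioning workload can be split across devices and processed in batches on each. As a minimal sketch of that idea (the function and parameter names here are illustrative, not Joycaption's actual API):

```python
# Hypothetical sketch: shard an image list across GPUs, then split each
# shard into fixed-size batches. This mirrors the general pattern behind
# multi-GPU + batched captioning; it is not Joycaption's real interface.

def shard_batches(images, num_gpus, batch_size):
    """Return one list of batches per GPU (round-robin sharding)."""
    shards = [images[i::num_gpus] for i in range(num_gpus)]
    return [
        [shard[j:j + batch_size] for j in range(0, len(shard), batch_size)]
        for shard in shards
    ]

# Example: 10 images, 2 GPUs, batches of 4.
work = shard_batches([f"img_{i}.png" for i in range(10)], num_gpus=2, batch_size=4)
# GPU 0 captions images 0, 2, 4, 6, 8 in two batches; GPU 1 gets the rest.
```

Round-robin sharding keeps the per-GPU load balanced even when the image count is not an exact multiple of the GPU count.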

FLUX LoRA training configurations are fully updated and now work on GPUs with as little as 8GB of VRAM; yes, you can train a 12-billion-parameter model on an 8 GB GPU, with very good speed and quality > https://www.patreon.com/posts/110293257
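Some back-of-the-envelope math shows why fitting a 12-billion-parameter model on an 8 GB card is notable. The arithmetic below only counts the model weights at different precisions (activations, gradients, and optimizer state are deliberately left out), and the 12B figure is the approximate FLUX transformer size from the post above:

```python
# Rough VRAM math for the headline claim. Bytes-per-parameter values are
# standard (bf16 = 2 bytes, 8-bit = 1 byte); everything beyond raw weights
# is omitted for simplicity.

PARAMS = 12e9  # approximate FLUX parameter count

def weight_gib(bytes_per_param):
    """Memory required for the weights alone, in GiB."""
    return PARAMS * bytes_per_param / 2**30

bf16 = weight_gib(2)  # ~22.4 GiB: far beyond an 8 GB card
int8 = weight_gib(1)  # ~11.2 GiB: still over 8 GB, so on top of reduced
                      # precision, part of the model must live in system RAM
                      # and be swapped in during training
print(f"bf16 weights: {bf16:.1f} GiB, 8-bit weights: {int8:.1f} GiB")
```

Even at 8-bit precision the weights alone exceed 8 GB, which is why low-VRAM configurations must combine reduced precision with some form of CPU offloading, at a cost in training speed.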

Check out the images to see all details


This article is reproduced from: https://dev.to/furkangozukara/huge-daily-developments-for-flux-lora-training-now-even-works-on-8gb-gpu-and-more-5802

