Replies: 2 comments 1 reply
I think the data in this project is already highly compressed on every front. At the current level of performance, unless you are working in a very narrow, specific domain and supply your own high-quality dataset, I personally think it would be very hard to train on a personal GPU, so I still recommend using a rented GPU platform. For quick testing, my approach is to cut the full dataset down to 1/10 for verification: on Google Colab with a 16 GB T4 GPU, training on this reduced pretrain dataset takes about 20 minutes, and SFT takes about 25 minutes.
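In case it helps anyone reproduce the quick-verification setup above, here is a minimal sketch of cutting a JSONL dataset down to roughly 1/10 of its samples. The file names are assumptions for illustration, not the project's actual data files.

```python
# Minimal sketch: keep ~10% of the lines of a JSONL dataset for a quick training run.
# Paths below are hypothetical; point them at the real pretrain/SFT data files.
import random

def subsample_jsonl(src_path: str, dst_path: str, keep_ratio: float = 0.1, seed: int = 42) -> None:
    """Write roughly `keep_ratio` of the lines from src_path to dst_path."""
    rng = random.Random(seed)
    kept = 0
    with open(src_path, "r", encoding="utf-8") as src, open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            if rng.random() < keep_ratio:
                dst.write(line)
                kept += 1
    print(f"kept {kept} samples -> {dst_path}")

if __name__ == "__main__":
    subsample_jsonl("pretrain_data.jsonl", "pretrain_data_small.jsonl", keep_ratio=0.1)
    subsample_jsonl("sft_data.jsonl", "sft_data_small.jsonl", keep_ratio=0.1)
```

Random sampling keeps the reduced set roughly representative of the full data; you could also just take the first 10% of lines if you only care about checking that the training loop runs.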
Use a rented GPU platform.
First of all, I would like to thank you for creating this awesome project; I was personally looking for something like this. But when I saw your PC specs you have 2x RTX 3090 :-) which is much better than what I have, so I was wondering: would it be possible for me to train on my GPU, and if so, how much time do you think it would take? Also, can I reduce the training time by any method other than reducing the training data?