Replies: 2 comments 2 replies
wagtail-localize has support for several machine translators already; each of these is an implementation of the same base translator class. I'd suggest having a look over https://github.com/wagtail/wagtail-localize/tree/bb04c7627701c7922be66389a1aaa2a721a09559/wagtail_localize/machine_translators
Sounds like what you want to do could be achieved by making your own implementation of that base class.
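To make the suggestion concrete, here is a minimal sketch of what such an implementation could look like. The `BaseMachineTranslator` below is a stand-in defined inline so the example runs on its own; in a real project you would instead subclass the actual base class from the `machine_translators` directory linked above, and the method names should be verified against that source.

```python
class BaseMachineTranslator:
    """Stand-in for wagtail-localize's translator base class (assumption,
    defined here only so this sketch is self-contained)."""

    display_name = None

    def __init__(self, options):
        self.options = options


class EchoTranslator(BaseMachineTranslator):
    """Toy backend: 'translates' each string by returning it unchanged."""

    display_name = "Echo"

    def translate(self, source_locale, target_locale, strings):
        # A real backend would call out to a translation service here and
        # return a mapping of source string -> translated string.
        return {string: string for string in strings}

    def can_translate(self, source_locale, target_locale):
        # Only offer translation between two different locales.
        return source_locale != target_locale
```

The key point is that the translator is just a class with a couple of methods, so swapping the service call for a local model is mostly a matter of what `translate` does internally.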
To add to what @chris48s shared, I'd suggest making that its own module/package, as you can iterate faster when it is on its own.
I'm wondering: is anyone interested in, and would it be possible within this project, to integrate local LLM backends? (Of course I will publish my code with tests.)
These might be useful for developers wanting to translate locally, without cost. (There are some really low-resource models out there!)
This allows for more leniency, but there is a drawback: most of the reasonably performant translation models I could find do not support HTML translation.
In my own project I currently have the following block, which only enables the local AI backend if `transformers` is installed. I'm wondering if this would be the recommended way of implementing it.
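The original code block did not survive the page scrape. As a hedged sketch of the pattern being described (gating a backend on an optional dependency), one common approach is an importability check; the function and backend names below are hypothetical illustrations, not wagtail-localize APIs:

```python
import importlib.util

# True only when the optional `transformers` package is importable.
# find_spec probes the import system without actually importing the package.
TRANSFORMERS_AVAILABLE = importlib.util.find_spec("transformers") is not None


def available_translators() -> list[str]:
    """Return backend names, including the local one only when usable.

    The backend names here are hypothetical placeholders.
    """
    translators = ["deepl", "google"]
    if TRANSFORMERS_AVAILABLE:
        translators.append("local-llm")
    return translators
```

Using `find_spec` rather than a bare `try: import transformers` avoids paying the (potentially slow) import cost just to check availability.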