Local inference
How to use g4f to run language models locally
Required dependencies
Make sure to install the required dependencies by running:
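```bash
# install g4f together with its local-inference extras
pip install -U g4f[local]
```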
or
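```bash
# alternatively, install the gpt4all backend directly
pip install -U gpt4all
```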
Basic usage example
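A minimal sketch of a streaming chat completion, assuming the `LocalClient` entry point in `g4f.local` and using `orca-mini-3b` as a stand-in for whichever supported model you have downloaded:

```python
from g4f.local import LocalClient

client = LocalClient()
response = client.chat.completions.create(
    model="orca-mini-3b",                          # stand-in model name
    messages=[{"role": "user", "content": "hi"}],
    stream=True,                                   # yield tokens as they arrive
)

# print the streamed tokens as they are generated
for chunk in response:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```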
Upon first use, you will be prompted to download the model. If you respond with `y`, g4f will download it for you.
You can also manually place supported models into `./g4f/local/models/`.
You can get a list of the currently supported models by running:
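```python
from g4f.local import LocalClient

client = LocalClient()
# list_models() is assumed here from g4f's local client API; it
# enumerates the models available to the local provider.
client.list_models()
```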
Local inference support was first released on 11 March 2024 and will receive further updates soon.