forgot the README
parent 46fca3eb7f, commit 20fcc9317f

## Setup
This is untested. It might work.

* Serve your meme library from a static webserver.
    * I use nginx. If you're in a hurry, you can use `python -m http.server` (see the example after this list).
* Install Python dependencies with `pip` from `requirements.txt` (the versions probably don't need to match exactly; I just put in what I currently have installed).
    * You now need a [patched version](https://github.com/osmarks/transformers-patch-siglip) of `transformers` due to SigLIP support.
    * I have converted exactly one SigLIP model: [https://huggingface.co/gollark/siglip-so400m-14-384](https://huggingface.co/gollark/siglip-so400m-14-384). It's apparently the best one. If you don't like it, find out how to convert more. You need to download that repo (one way is sketched after this list).
* Run `clip_server.py` (as a background service; an invocation example appears after this list).
    * It is configured with a JSON file given to it as its first argument. An example is in `clip_server_config.json`; a sketch is also shown after this list.
        * `device` should probably be `cuda` or `cpu`. The model will run on this device.
        * `model` is the path to the SigLIP model repository.
        * `model_name` is the name of the model, for metrics purposes.
        * `max_batch_size` controls the maximum allowed batch size. Higher values generally give somewhat better performance at the cost of higher VRAM use, though the bottleneck is currently mostly elsewhere.
        * `port` is the port to run the HTTP server on.
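
For the static webserver step, a quick way to test is something like the following (the directory path is an assumption; point it at wherever your memes actually live). Note that `--directory` needs Python 3.7+, and nginx remains the better option for real use:

```
python -m http.server 8000 --directory /path/to/memes
```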
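
A sketch of the dependency and model setup. The `git lfs` step and the pip-installability of the patched fork are assumptions, so check both repos if anything fails:

```
pip install -r requirements.txt
# assumption: the patched transformers fork installs cleanly from its default branch
pip install git+https://github.com/osmarks/transformers-patch-siglip
# model weights are large files, so git-lfs is probably needed for the clone
git lfs install
git clone https://huggingface.co/gollark/siglip-so400m-14-384
```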
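
Putting the options above together, a minimal `clip_server_config.json` might look like the following. The values here are illustrative assumptions, not shipped defaults; the example file in the repo is authoritative:

```json
{
    "device": "cuda",
    "model": "./siglip-so400m-14-384",
    "model_name": "siglip-so400m-14-384",
    "max_batch_size": 128,
    "port": 1708
}
```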
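
Since the config file is passed as the first argument, starting the server (in the foreground here; wrap it in your service manager of choice to run it in the background) is just:

```
python clip_server.py clip_server_config.json
```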