If you want to try out Weaviate locally and on a small scale, you can use Docker Compose. If you are new to Docker (Compose) and containerization, check out our Docker Introduction for Weaviate Users.

To start Weaviate with docker-compose, you need a docker-compose configuration file. You can obtain it from the configuration tool above, or alternatively pick one of the examples below. Additional environment variables can be set in this file, which regulate your Weaviate setup, authentication and authorization, module settings, and data storage settings.

If you do not wish to use the configuration tool above to customize your setup, you can also use one of the following three example docker-compose.yml files. To run any of the examples, save one of the snippets as docker-compose.yml and start it by running docker-compose up from within the folder that contains the file.

Note: at the moment, text vectorization modules cannot be combined in a single setup. This means that you can enable either the text2vec-contextionary, the text2vec-transformers, or no text vectorization module.

## Weaviate with the text2vec-transformers model

An example docker-compose setup file (compose file version '3.4') uses the transformers model sentence-transformers/msmarco-distilroberta-base-v2. It enables anonymous access (AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'), sets DEFAULT_VECTORIZER_MODULE: text2vec-transformers and CLUSTER_HOSTNAME: 'node1', points TRANSFORMERS_INFERENCE_API at the inference container, and runs the inference image semitechnologies/transformers-inference:sentence-transformers-msmarco-distilroberta-base-v2 with the line `# NVIDIA_VISIBLE_DEVICES: all # enable if running with CUDA` left commented out.

The text2vec-transformers module requires at least Weaviate version v1.2.0. Note that transformer models are neural networks built to run on GPUs. Running Weaviate with the text2vec-transformers module and without a GPU is possible, but slower. Enable CUDA if you have a GPU available. For more information on how to set up the environment with GPU support, see the text2vec-transformers module documentation.

## Weaviate with the text2vec-contextionary model

An example docker-compose setup file (compose file version '3.4') with the English language contextionary model uses the image semitechnologies/contextionary:en0.16.0-v1.0.2 and sets, among other variables, PERSISTENCE_DATA_PATH: '/var/lib/weaviate', DEFAULT_VECTORIZER_MODULE: 'text2vec-contextionary', EXTENSIONS_STORAGE_ORIGIN, and NEIGHBOR_OCCURRENCE_IGNORE_PERCENTILE: 5.

The text2vec-contextionary module is designed to run on CPU hardware and does not require or benefit from GPU-accelerated hardware.

## Weaviate without any modules

An example docker-compose setup (compose file version '3.4') runs Weaviate without any modules. In this case, no model inference is performed at either import or search time. You will need to provide your own vectors (e.g. from an outside ML model) at import and search time.

## Attaching to the log output of Weaviate only

The output of docker-compose up is quite verbose, as it attaches to the logs of all containers. You can attach to the logs of Weaviate only, for example, by running the following command instead of docker-compose up:

```shell
# Run Docker Compose
$ docker-compose up -d && docker-compose logs -f weaviate
```

Alternatively, you can run docker-compose entirely detached with docker-compose up -d and then poll /v1/meta until you receive a status 200 OK.
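Pulling the transformers settings mentioned above together, a minimal compose file could look roughly as follows. This is a sketch, not the original file: the service names (`weaviate`, `t2v-transformers`), the port mapping, the weaviate image tag, the ENABLE_MODULES value, the ENABLE_CUDA flag, and the TRANSFORMERS_INFERENCE_API URL are all assumptions.

```yaml
version: '3.4'
services:
  weaviate:
    image: semitechnologies/weaviate:1.2.0  # assumed tag; the module needs >= v1.2.0
    ports:
      - "8080:8080"                         # assumed port mapping
    environment:
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      DEFAULT_VECTORIZER_MODULE: text2vec-transformers
      ENABLE_MODULES: text2vec-transformers                       # assumed
      TRANSFORMERS_INFERENCE_API: 'http://t2v-transformers:8080'  # assumed URL
      CLUSTER_HOSTNAME: 'node1'
  t2v-transformers:                         # assumed service name
    image: semitechnologies/transformers-inference:sentence-transformers-msmarco-distilroberta-base-v2
    environment:
      # NVIDIA_VISIBLE_DEVICES: all # enable if running with CUDA
      ENABLE_CUDA: '0'                      # assumed flag; set to '1' with a GPU
```

The two-service layout reflects how the module works: Weaviate itself stays lightweight and delegates vectorization to the separate inference container it reaches via TRANSFORMERS_INFERENCE_API.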
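A similar sketch for the contextionary setup, with the same caveats: the service names, port mapping, weaviate image tag, ENABLE_MODULES, CONTEXTIONARY_URL, and the EXTENSIONS_STORAGE_ORIGIN value are assumptions rather than values from the original file.

```yaml
version: '3.4'
services:
  weaviate:
    image: semitechnologies/weaviate:1.2.0  # assumed tag
    ports:
      - "8080:8080"                         # assumed port mapping
    environment:
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'text2vec-contextionary'
      ENABLE_MODULES: text2vec-contextionary             # assumed
      CONTEXTIONARY_URL: contextionary:9999              # assumed address
  contextionary:                            # assumed service name
    image: semitechnologies/contextionary:en0.16.0-v1.0.2
    environment:
      EXTENSIONS_STORAGE_ORIGIN: 'http://weaviate:8080'  # assumed value
      NEIGHBOR_OCCURRENCE_IGNORE_PERCENTILE: 5
```

Because this module is CPU-only, no CUDA-related settings appear in the inference container.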
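The detached workflow ends with polling /v1/meta until it returns 200 OK. That readiness loop can be sketched in Python; the status fetcher is passed in as a function so the loop can be exercised without a running instance (the endpoint URL in the usage note and the default timeout are assumptions).

```python
import time
from typing import Callable


def wait_until_ready(fetch_status: Callable[[], int],
                     timeout: float = 60.0,
                     interval: float = 1.0) -> bool:
    """Poll fetch_status() until it returns HTTP 200 or the timeout expires."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            if fetch_status() == 200:
                return True
        except OSError:
            pass  # server not accepting connections yet; keep polling
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)
```

Against a local instance you might pass, for example, `lambda: urllib.request.urlopen("http://localhost:8080/v1/meta").getcode()` (hypothetical local URL).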