
Add or Replace Documents

Add an array of documents or replace them if they already exist.

If you send a document with an _id that corresponds to an existing document, the new document will overwrite the existing document.

This endpoint accepts the application/json content type.


POST /indexes/{index_name}/documents

Path parameters

| Name | Type | Description |
| --- | --- | --- |
| index_name | String | Name of the index |

Query parameters

| Query Parameter | Type | Default Value | Description |
| --- | --- | --- | --- |
| device | String | null | The device used to index the documents. If device is not specified and CUDA devices are available to Marqo (see here for more info), Marqo will speed up the indexing process by using available CUDA devices. Otherwise, the CPU will be used. Options include cpu, cuda, cuda1, cuda2, etc. The cuda option tells Marqo to use any available CUDA devices. |
| telemetry | Boolean | False | If true, the telemetry object is returned in the add documents response body. This includes information like latency metrics. This is set at client instantiation time in the Python client: `mq = marqo.Client(return_telemetry=True)` |
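
For example, here is a minimal Python-client sketch of these parameters: telemetry is enabled when the client is constructed (as noted above), and the device keyword is assumed to be forwarded as this query parameter by the Python client.

import marqo

# Telemetry is enabled at client instantiation (see the table above).
mq = marqo.Client(url="http://localhost:8882", return_telemetry=True)

# "cuda" asks Marqo to use any available CUDA device; otherwise the CPU is used.
# Assumption: the Python client forwards `device` as the query parameter above.
res = mq.index("my-first-index").add_documents(
    [{"Title": "The Travels of Marco Polo"}],
    tensor_fields=["Title"],
    device="cuda",
)
print(res.get("telemetry"))  # latency metrics when return_telemetry=True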

Body

In the REST API and for curl users, these parameters are in lowerCamelCase, as presented in the following table. The Python client uses the pythonic snake_case equivalents.

| Add documents parameters | Value Type | Default Value | Description |
| --- | --- | --- | --- |
| documents | Array of objects | n/a | An array of documents. Each document is represented as a JSON object. You can optionally set a document's ID with the special _id field. The _id must be a string. If an ID is not specified, Marqo will generate one. |
| tensorFields | Array of Strings | [] | The fields in these documents that will be tensor fields, and therefore have vectors generated for them. Structured indexes only support defining tensor fields at index creation time, while unstructured indexes specify them when adding documents. Tensor search can only be performed on these fields for these documents. Pre-filtering and lexical search are still available on text fields not included in tensorFields. For the best recall and speed performance, we recommend minimising the number of different tensor fields in your index. For production use cases where speed and recall are critical, we recommend a single tensor field for the entire index. |
| useExistingTensors | Boolean | false | Setting this to true will reuse existing tensors for unchanged fields in documents that are indexed with an ID. Note: Marqo analyses the field string for updates, so Marqo can't detect a change if a URL points to a different image. |
| imageDownloadHeaders (deprecated) | Dict | null | An object that consists of key-value pair headers for image download. Can be used to authenticate the images for download. |
| mediaDownloadHeaders | Dict | null | An object that consists of key-value pair headers for media download. Can be used to authenticate all types of media for download. |
| mappings | Dict | null | An object to handle object fields in documents. Check mappings for more information. Mappings are required to create multimodal combination and custom vector fields - see here for more information. |
| modelAuth | Dict | null | An object that consists of authorisation details used by Marqo to download non-publicly available models. Check here for more information. |
| clientBatchSize | Integer | null | A Python-client-only helper parameter that splits very large lists of documents into batches of a more manageable size for Marqo. |
| textChunkPrefix | String | null | The prefix added to indexed text document chunks when embedding. Setting this field overrides the textChunkPrefix set in the index settings during index creation. If unset, it defaults to the prefixes defined in the index settings. For more information on default values for index settings, see create_index. |
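
As an illustration, here is a minimal Python-client sketch combining several of these parameters (snake_case, per the note above). The keyword names client_batch_size, use_existing_tensors, and text_chunk_prefix are assumed to map one-to-one onto the lowerCamelCase fields in the table.

import marqo

mq = marqo.Client(url="http://localhost:8882")

# A small synthetic list of documents for illustration only.
docs = [
    {"_id": f"doc_{i}", "Description": f"Example passage number {i}"}
    for i in range(200)
]

mq.index("my-first-index").add_documents(
    documents=docs,
    tensor_fields=["Description"],
    client_batch_size=64,           # Python-client-side batching of the 200 docs
    use_existing_tensors=True,      # reuse stored vectors for unchanged fields
    text_chunk_prefix="passage: ",  # override the prefix set at index creation
)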

Additional Marqo Cloud Body Parameters

Marqo Cloud creates dedicated infrastructure for each index. Using the create index endpoint, you can specify the type of storage for the index (storageClass) and the type of inference (inferenceType). The number of storage shards is defined by numberOfShards, the number of replicas by numberOfReplicas, and the number of Marqo inference nodes by numberOfInferences. These parameters are only supported for Marqo Cloud, not Marqo Open Source.

| Name | Type | Default value | Description | Open Source | Cloud |
| --- | --- | --- | --- | --- | --- |
| inferenceType | String | marqo.CPU.small | Type of inference for the index. Options are "marqo.CPU.small" (deprecated), "marqo.CPU.large", "marqo.GPU". | ❌ | ✅ |
| storageClass | String | marqo.basic | Type of storage for the index. Options are "marqo.basic", "marqo.balanced", "marqo.performance". | ❌ | ✅ |
| numberOfShards | Integer | 1 | The number of shards for the index. | ❌ | ✅ |
| numberOfReplicas | Integer | 0 | The number of replicas for the index. | ❌ | ✅ |
| numberOfInferences | Integer | 1 | The number of inference nodes for the index. | ❌ | ✅ |
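
Since these settings belong to the create index endpoint (as described above), here is a hedged Python sketch of passing them on Marqo Cloud; the snake_case keyword names are assumed to mirror the table.

import marqo

# Marqo Cloud client; replace the API key with your own.
mq = marqo.Client(url="https://api.marqo.ai", api_key="XXXXXXXXXXXXXXX")

# Sketch only: keyword names are assumed snake_case equivalents of the table,
# and these settings apply to Marqo Cloud, not Marqo Open Source.
mq.create_index(
    "my-first-index",
    inference_type="marqo.CPU.large",
    storage_class="marqo.balanced",
    number_of_shards=1,
    number_of_replicas=0,
    number_of_inferences=1,
)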

Response

The response of the add_or_replace_documents endpoint in Marqo operates on two levels. Firstly, a status code of 200 in the overall response indicates that the batch request has been successfully received and processed by Marqo. The response has the following fields:

| Field Name | Type | Description |
| --- | --- | --- |
| errors | Boolean | Indicates whether any errors occurred during the processing of the batch request. |
| items | Array | An array of objects, each representing the processing status of an individual document in the batch. |
| processingTimeMs | Integer | The time taken to process the batch request, in milliseconds. |
| index_name | String | The name of the index to which the documents were added. |

However, a 200 status does not necessarily imply that each individual document within the batch was processed without issues. For each document in the batch, there will be an associated response code that specifies the status of that particular document's processing. These individual response codes provide granular feedback, allowing users to discern which documents were successfully processed, which encountered errors, and the nature of any issues encountered.

Each item in the items array has the following fields:

| Field Name | Type | Description |
| --- | --- | --- |
| _id | String | The ID of the document that was processed. |
| status | Integer | The status code of the document processing. |
| message | String | A message that provides additional information about the processing status of the document. This field only exists when the status is not 200. |

Here are the HTTP status codes for the individual document responses (non-exhaustive list):

| Status Code | Description |
| --- | --- |
| 200 | The document was successfully added to the index. |
| 400 | Bad request. Returned for invalid input (e.g., invalid field types). Inspect message for details. |
| 429 | The Marqo vector store received too many requests. Please try again later. |
| 500 | Internal error. |
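
For example, here is a minimal Python sketch of checking the two levels of the response described above: the top-level errors flag, then the per-document status (and message, when present) of each item.

import marqo

mq = marqo.Client(url="http://localhost:8882")

res = mq.index("my-first-index").add_documents(
    [{"_id": "article_591", "Description": "The EMU is a spacesuit"}],
    tensor_fields=["Description"],
)

if res["errors"]:
    # Only failed items carry a "message"; successful ones just have status 200.
    for item in res["items"]:
        if item["status"] != 200:
            print(item["_id"], item["status"], item.get("message"))
else:
    print(f"Indexed {len(res['items'])} documents in {res['processingTimeMs']} ms")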

Example

For an unstructured index:

curl -XPOST 'http://localhost:8882/indexes/my-first-index/documents' \
-H 'Content-type:application/json' -d '
{
"documents": [ 
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }
],
"tensorFields": ["Description"]
}'
mq.index("my-first-index").add_documents([
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }],
    tensor_fields=["Description"]
)

curl -XPOST 'your_endpoint/indexes/my-first-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-type:application/json' -d '
{
"documents": [ 
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }
],
"tensorFields": ["Description"]
}'
For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

mq.index("my-first-index").add_documents([
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }],
    tensor_fields=["Description"]
)

For a structured index:

curl -XPOST 'http://localhost:8882/indexes/my-first-structured-index/documents' \
-H 'Content-type:application/json' -d '
{
"documents": [ 
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }
]
}'
mq.index("my-first-structured-index").add_documents([
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }]
)

curl -XPOST 'your_endpoint/indexes/my-first-structured-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-type:application/json' -d '
{
    "documents": [
        {
            "text_field": "The Travels of Marco Polo",
            "caption": "A 13th-century travelogue describing the travels of Polo",
            "tags": ["History"]
        },
        {
            "text_field": "Extravehicular Mobility Unit (EMU)",
            "caption": "The EMU is a spacesuit that provides environmental protection",
            "_id": "article_591",
            "tags": ["Science"]
        }
    ]
}'
For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

mq.index("my-first-structured-index").add_documents([
    {
        "Title": "The Travels of Marco Polo",
        "Description": "A 13th-century travelogue describing the travels of Polo",
        "Genre": "History"
    },
    {
        "Title": "Extravehicular Mobility Unit (EMU)",
        "Description": "The EMU is a spacesuit that provides environmental protection",
        "_id": "article_591",
        "Genre": "Science"
    }]
)

Response: 200 OK

{
  "errors": false,
  "items": [
    {
      "_id": "5aed93eb-3878-4f12-bc92-0fda01c7d23d",
      "status": 200
    },
    {
      "_id": "article_591",
      "status": 200
    }
  ],
  "processingTimeMs": 6,
  "index_name": "my-first-index"
}

The first document in this example had its _id generated by Marqo. A document with _id = article_591 already existed in Marqo, so it was updated rather than created. In the unstructured index we want Description to be searchable with tensor search (Marqo's default search), so we explicitly declare it as a tensor field. In the structured index the tensor fields are specified during index creation, so we don't need to specify them here. Tensor fields are stored alongside vector representations of the data, allowing for multimodal and semantic search.

If you would like to see an example of adding video and audio documents, please visit this section.

Documents

Parameter: documents

Expected value: An array of documents (default maximum length: 128). Each document is a JSON object that is to be added to the index. Each key is the name of a document's field and its value is the content for that field. See here for the allowed field data types. The optional _id key can be used to specify a string as the document's ID.

Map Fields

Only flat numeric dictionaries with int, long, float, and double values are currently supported as document fields.

[
  {
    "Title": "The Travels of Marco Polo",
    "Description": "A 13th-century travelogue describing Polo's travels",
  },
  {
    "Title": "Extravehicular Mobility Unit (EMU)",
    "Description": "The EMU is a spacesuit that provides environmental protection",
    "_id": "article_591"
  },
  {
    "Title": "The Travels of Marco Polo",
    "Description": "A 13th-century travelogue describing Polo's travels",
    "map_numeric_field": {
        "popularity": 56.4,
        "availability": 0.9,
        "year_published": 1300,
    }
  },
]

Mappings

Parameter: mappings

Expected value: JSON object with field names as keys, mapped to objects with type (currently only multimodal_combination and custom_vector are supported). Multimodal combination mappings also have weights, which is an object that maps each nested field to a relative weight.

Default value: null

The mappings object allows adding special fields such as multimodal combination fields, custom vector fields, and map score modifiers.

With multimodal fields, child fields are vectorised and combined into a single tensor via a weighted sum using the weights object; a conceptual sketch of this combination appears below. The combined tensor will be used for tensor search.

With custom vector fields, vectors can be directly inserted into documents. This is useful if you are generating your vectors outside of Marqo.

With map score modifiers, child field values can be of type int, long, float, or double. These values will be used in the score modifier computation during search.

All multimodal combination or custom vector fields must be in tensor_fields.

Dependent fields can be used for lexical search or vector search with filtering. Dependent fields can only have content of type str, representing a text or a pointer (URL) to an image.

The mappings object is optional for structured indexes and is only needed to override the default multimodal weights defined at index creation time. Additionally, custom vector fields should not be declared in mappings for structured indexes.
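
To picture the weighted-sum combination described above, here is a conceptual numpy sketch. It is an illustration only, not Marqo's internal implementation (any normalisation step is omitted), and the 512-dimensional stand-in vectors are assumptions.

import numpy as np

# Conceptual illustration of a multimodal_combination field: each child field
# is vectorised, then the vectors are combined using the weights from mappings.
weights = {"img": 0.9, "caption": 0.1}
child_vectors = {
    "img": np.random.rand(512),      # stand-in for the image embedding
    "caption": np.random.rand(512),  # stand-in for the text embedding
}
combined = sum(weight * child_vectors[field] for field, weight in weights.items())
print(combined.shape)  # a single tensor per document is stored and searched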

Read more about using mappings and special fields here.

Example: Multimodal Combination

Unstructured Index (Default)

# Create an unstructured index (default)
curl -X POST 'http://localhost:8882/indexes/my-first-index' \
-H "Content-Type: application/json" \
-d '{
    "treatUrlsAndPointersAsImages": true,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k"
}'

# Add documents with mappings to specify multimodal combination fields
curl -X POST 'http://localhost:8882/indexes/my-first-index/documents' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse"
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky"
        }
    ],
    "mappings": {
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {
                "img": 0.9, "caption": 0.1
            }
        }
    },
    "tensorFields": ["my_combination_field"]
}'
# Create an unstructured index (default)
mq.create_index(
    "my-first-index",
    treat_urls_and_pointers_as_images=True,
    model="open_clip/ViT-B-32/laion2b_s34b_b79k",
)

# Add documents with mappings to specify multimodal combination fields
mq.index("my-first-index").add_documents(
    [
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse",
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky",
        },
    ],
    mappings={
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {"img": 0.9, "caption": 0.1},
        }
    },
    # multimodal combination fields must be in tensor_fields
    tensor_fields=["my_combination_field"],
)

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

# Create an unstructured index (default)
curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-first-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "treatUrlsAndPointersAsImages": true,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k"
}'

# Add documents with mappings to specify multimodal combination fields
curl -X POST 'your_endpoint/indexes/my-first-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse"
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky"
        }
    ],
    "mappings": {
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {
                "img": 0.9, "caption": 0.1
            }
        }
    },
    "tensorFields": ["my_combination_field"]
}'
For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

# Create an unstructured index (default)
mq.create_index(
    "my-first-index",
    treat_urls_and_pointers_as_images=True,
    model="open_clip/ViT-B-32/laion2b_s34b_b79k",
)

# Add documents with mappings to specify multimodal combination fields
mq.index("my-first-index").add_documents(
    [
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse",
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky",
        },
    ],
    mappings={
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {"img": 0.9, "caption": 0.1},
        }
    },
    # multimodal combination fields must be in tensor_fields
    tensor_fields=["my_combination_field"],
)

Structured Index

# Alternatively you can create a structured index with multimodal combination fields
curl -X POST 'http://localhost:8882/indexes/my-first-structured-index' \
-H "Content-Type: application/json" \
-d '{
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "type": "structured",
    "allFields": [
        {"name": "caption", "type": "text"}, 
        {"name": "img", "type": "image_pointer"},
        {"name": "my_combination_field", "type": "multimodal_combination", 
        "dependentFields": {"caption": 0.5, "img": 0.5}}
    ],
    "tensorFields": ["my_combination_field"]
}'

# Add documents
# The mappings object is optional with structured indexes and is only needed if the user needs to
# override default multimodal weights defined at index creation time.
curl -X POST 'http://localhost:8882/indexes/my-first-structured-index/documents' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse"
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky"
        }
    ],
    "mappings": {
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {
                "img": 0.6, "caption": 0.4
            }
        }
    }
}'
# Alternatively you can create a structured index with multimodal combination fields
mq.create_index(
    "my-first-structured-index",
    type="structured",
    model="open_clip/ViT-B-32/laion2b_s34b_b79k",
    all_fields=[
        {"name": "caption", "type": "text"},
        {"name": "img", "type": "image_pointer"},
        {
            "name": "my_combination_field",
            "type": "multimodal_combination",
            "dependent_fields": {"caption": 0.5, "img": 0.5},
        },
    ],
    tensor_fields=["my_combination_field"],
)

# Add documents
mq.index("my-first-structured-index").add_documents(
    [
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse",
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky",
        },
    ],
    # The mappings object is optional with structured indexes and is only needed if the user needs to
    # override default multimodal weights defined at index creation time.
    mappings={
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {"img": 0.6, "caption": 0.4},
        }
    },
)

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

# Alternatively you can create a structured index with multimodal combination fields
curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-first-structured-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "type": "structured",
    "allFields": [
        {"name": "caption", "type": "text"}, 
        {"name": "img", "type": "image_pointer"},
        {"name": "my_combination_field", "type": "multimodal_combination", 
        "dependentFields": {"caption": 0.5, "img": 0.5}}
    ],
    "tensorFields": ["my_combination_field"]
}'

# Add documents
# The mappings object is optional with structured indexes and is only needed if the user needs to
# override default multimodal weights defined at index creation time.
curl -X POST 'your_endpoint/indexes/my-first-structured-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse"
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky"
        }
    ],
    "mappings": {
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {
                "img": 0.6, "caption": 0.4
            }
        }
    }
}'
For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

# Alternatively you can create a structured index with multimodal combination fields
mq.create_index(
    "my-first-structured-index",
    type="structured",
    model="open_clip/ViT-B-32/laion2b_s34b_b79k",
    all_fields=[
        {"name": "caption", "type": "text"},
        {"name": "img", "type": "image_pointer"},
        {
            "name": "my_combination_field",
            "type": "multimodal_combination",
            "dependent_fields": {"caption": 0.5, "img": 0.5},
        },
    ],
    tensor_fields=["my_combination_field"],
)

# Add documents
mq.index("my-first-structured-index").add_documents(
    [
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image1.jpg",
            "caption": "A man riding horse",
        },
        {
            "img": "https://raw.githubusercontent.com/marqo-ai/marqo/mainline/examples/ImageSearchGuide/data/image2.jpg",
            "caption": "An airplane flying in the sky",
        },
    ],
    # The mappings object is optional with structured indexes and is only needed if the user needs to
    # override default multimodal weights defined at index creation time.
    mappings={
        "my_combination_field": {
            "type": "multimodal_combination",
            "weights": {"img": 0.6, "caption": 0.4},
        }
    },
)

Example: Custom Vectors

(Replace 'vector' field value with your own vectors!)

Unstructured Index (Default)

# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512). 
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.
curl -X POST 'http://localhost:8882/indexes/my-first-index' \
-H "Content-Type: application/json" \
-d '{
    "treatUrlsAndPointersAsImages": true,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "annParameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16}
    }
}'

# We add the custom vector documents into our index (with mappings)
curl -X POST 'http://localhost:8882/indexes/my-first-index/documents' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "_id": "doc1",
            "my_custom_vector": {
                "vector": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479, 480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 511],
                "content": "Singing audio file"
            }
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                "vector": [1.0, 0.5, 0.3333, 0.25, 0.2, 0.1667, 0.1429, 0.125, 0.1111, 0.1, 0.0909, 0.0833, 0.0769, 0.0714, 0.0667, 0.0625, 0.0588, 0.0556, 0.0526, 0.05, 0.0476, 0.0455, 0.0435, 0.0417, 0.04, 0.0385, 0.037, 0.0357, 0.0345, 0.0333, 0.0323, 0.0312, 0.0303, 0.0294, 0.0286, 0.0278, 0.027, 0.0263, 0.0256, 0.025, 0.0244, 0.0238, 0.0233, 0.0227, 0.0222, 0.0217, 0.0213, 0.0208, 0.0204, 0.02, 0.0196, 0.0192, 0.0189, 0.0185, 0.0182, 0.0179, 0.0175, 0.0172, 0.0169, 0.0167, 0.0164, 0.0161, 0.0159, 0.0156, 0.0154, 0.0152, 0.0149, 0.0147, 0.0145, 0.0143, 0.0141, 0.0139, 0.0137, 0.0135, 0.0133, 0.0132, 0.013, 0.0128, 0.0127, 0.0125, 0.0123, 0.0122, 0.012, 0.0119, 0.0118, 0.0116, 0.0115, 0.0114, 0.0112, 0.0111, 0.011, 0.0109, 0.0108, 0.0106, 0.0105, 0.0104, 0.0103, 0.0102, 0.0101, 0.01, 0.0099, 0.0098, 0.0097, 0.0096, 0.0095, 0.0094, 0.0093, 0.0093, 0.0092, 0.0091, 0.009, 0.0089, 0.0088, 0.0088, 0.0087, 0.0086, 0.0085, 0.0085, 0.0084, 0.0083, 0.0083, 0.0082, 0.0081, 0.0081, 0.008, 0.0079, 0.0079, 0.0078, 0.0078, 0.0077, 0.0076, 0.0076, 0.0075, 0.0075, 0.0074, 0.0074, 0.0073, 0.0072, 0.0072, 0.0071, 0.0071, 0.007, 0.007, 0.0069, 0.0069, 0.0068, 0.0068, 0.0068, 0.0067, 0.0067, 0.0066, 0.0066, 0.0065, 0.0065, 0.0065, 0.0064, 0.0064, 0.0063, 0.0063, 0.0063, 0.0062, 0.0062, 0.0061, 0.0061, 0.0061, 0.006, 0.006, 0.006, 0.0059, 0.0059, 0.0058, 0.0058, 0.0058, 0.0057, 0.0057, 0.0057, 0.0056, 0.0056, 0.0056, 0.0056, 0.0055, 0.0055, 0.0055, 0.0054, 0.0054, 0.0054, 0.0053, 0.0053, 0.0053, 0.0053, 0.0052, 0.0052, 0.0052, 0.0052, 0.0051, 0.0051, 0.0051, 0.0051, 0.005, 0.005, 0.005, 0.005, 0.0049, 0.0049, 0.0049, 0.0049, 0.0048, 0.0048, 0.0048, 0.0048, 0.0047, 0.0047, 0.0047, 0.0047, 0.0047, 0.0046, 0.0046, 0.0046, 0.0046, 0.0045, 0.0045, 0.0045, 0.0045, 0.0045, 0.0044, 0.0044, 0.0044, 0.0044, 0.0044, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0042, 0.0042, 0.0042, 0.0042, 0.0042, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0022, 0.0022, 0.0022, 0.0022, 
0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002],
                "content": "Podcast audio file"
            }
        }
    ],
    "mappings": {
        "my_custom_vector": {
            "type": "custom_vector"
        }
    },
    "tensorFields": ["my_custom_vector"]
}'
# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512).
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

settings = {
    "treat_urls_and_pointers_as_images": True,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "ann_parameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("my-first-index", **settings)

# Example vectors for illustration only. Replace these with your own.
example_vector_1 = [i for i in range(512)]
example_vector_2 = [1 / (i + 1) for i in range(512)]

# We add the custom vector documents into our index (with mappings)
res = mq.index("my-first-index").add_documents(
    documents=[
        {
            "_id": "doc1",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_1,
                "content": "Singing audio file",
            },
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_2,
                "content": "Podcast audio file",
            },
        },
    ],
    mappings={"my_custom_vector": {"type": "custom_vector"}},
    tensor_fields=["my_custom_vector"],
)

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512). 
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.
curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-first-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "treatUrlsAndPointersAsImages": true,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "annParameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16}
    }
}'

# We add the custom vector documents into our index (with mappings)
curl -X POST 'your_endpoint/indexes/my-first-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "_id": "doc1",
            "my_custom_vector": {
                "vector": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479, 480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 511],
                "content": "Singing audio file"
            }
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                "vector": [1.0, 0.5, 0.3333, 0.25, 0.2, 0.1667, 0.1429, 0.125, 0.1111, 0.1, 0.0909, 0.0833, 0.0769, 0.0714, 0.0667, 0.0625, 0.0588, 0.0556, 0.0526, 0.05, 0.0476, 0.0455, 0.0435, 0.0417, 0.04, 0.0385, 0.037, 0.0357, 0.0345, 0.0333, 0.0323, 0.0312, 0.0303, 0.0294, 0.0286, 0.0278, 0.027, 0.0263, 0.0256, 0.025, 0.0244, 0.0238, 0.0233, 0.0227, 0.0222, 0.0217, 0.0213, 0.0208, 0.0204, 0.02, 0.0196, 0.0192, 0.0189, 0.0185, 0.0182, 0.0179, 0.0175, 0.0172, 0.0169, 0.0167, 0.0164, 0.0161, 0.0159, 0.0156, 0.0154, 0.0152, 0.0149, 0.0147, 0.0145, 0.0143, 0.0141, 0.0139, 0.0137, 0.0135, 0.0133, 0.0132, 0.013, 0.0128, 0.0127, 0.0125, 0.0123, 0.0122, 0.012, 0.0119, 0.0118, 0.0116, 0.0115, 0.0114, 0.0112, 0.0111, 0.011, 0.0109, 0.0108, 0.0106, 0.0105, 0.0104, 0.0103, 0.0102, 0.0101, 0.01, 0.0099, 0.0098, 0.0097, 0.0096, 0.0095, 0.0094, 0.0093, 0.0093, 0.0092, 0.0091, 0.009, 0.0089, 0.0088, 0.0088, 0.0087, 0.0086, 0.0085, 0.0085, 0.0084, 0.0083, 0.0083, 0.0082, 0.0081, 0.0081, 0.008, 0.0079, 0.0079, 0.0078, 0.0078, 0.0077, 0.0076, 0.0076, 0.0075, 0.0075, 0.0074, 0.0074, 0.0073, 0.0072, 0.0072, 0.0071, 0.0071, 0.007, 0.007, 0.0069, 0.0069, 0.0068, 0.0068, 0.0068, 0.0067, 0.0067, 0.0066, 0.0066, 0.0065, 0.0065, 0.0065, 0.0064, 0.0064, 0.0063, 0.0063, 0.0063, 0.0062, 0.0062, 0.0061, 0.0061, 0.0061, 0.006, 0.006, 0.006, 0.0059, 0.0059, 0.0058, 0.0058, 0.0058, 0.0057, 0.0057, 0.0057, 0.0056, 0.0056, 0.0056, 0.0056, 0.0055, 0.0055, 0.0055, 0.0054, 0.0054, 0.0054, 0.0053, 0.0053, 0.0053, 0.0053, 0.0052, 0.0052, 0.0052, 0.0052, 0.0051, 0.0051, 0.0051, 0.0051, 0.005, 0.005, 0.005, 0.005, 0.0049, 0.0049, 0.0049, 0.0049, 0.0048, 0.0048, 0.0048, 0.0048, 0.0047, 0.0047, 0.0047, 0.0047, 0.0047, 0.0046, 0.0046, 0.0046, 0.0046, 0.0045, 0.0045, 0.0045, 0.0045, 0.0045, 0.0044, 0.0044, 0.0044, 0.0044, 0.0044, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0042, 0.0042, 0.0042, 0.0042, 0.0042, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0022, 0.0022, 0.0022, 0.0022, 
0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002],
                "content": "Podcast audio file"
            }
        }
    ],
    "mappings": {
        "my_custom_vector": {
            "type": "custom_vector"
        }
    },
    "tensorFields": ["my_custom_vector"]
}'
For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512).
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

settings = {
    "treat_urls_and_pointers_as_images": True,
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "ann_parameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("my-first-index", **settings)

# Example vectors for illustration only. Replace these with your own.
example_vector_1 = [i for i in range(512)]
example_vector_2 = [1 / (i + 1) for i in range(512)]

# We add the custom vector documents into our index (with mappings)
res = mq.index("my-first-index").add_documents(
    documents=[
        {
            "_id": "doc1",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_1,
                "content": "Singing audio file",
            },
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_2,
                "content": "Podcast audio file",
            },
        },
    ],
    mappings={"my_custom_vector": {"type": "custom_vector"}},
    tensor_fields=["my_custom_vector"],
)

Structured Index

# For structured indexes, the custom vector field should be declared upon index creation (with type `custom_vector`).
# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512). 
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

curl -X POST 'http://localhost:8882/indexes/my-first-structured-index' \
-H "Content-Type: application/json" \
-d '{
    "type": "structured",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "allFields": [
        {"name": "my_custom_vector", "type": "custom_vector"}
    ],
    "tensorFields": ["my_custom_vector"],
    "annParameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16}
    }
}'

# We add the custom vector documents into our structured index.
# We do NOT use mappings for custom vectors here.
curl -X POST 'http://localhost:8882/indexes/my-first-structured-index/documents' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "_id": "doc1",
            "my_custom_vector": {
                "vector": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479, 480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 511],
                "content": "Singing audio file"
            }
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                "vector": [1.0, 0.5, 0.3333, 0.25, 0.2, 0.1667, 0.1429, 0.125, 0.1111, 0.1, 0.0909, 0.0833, 0.0769, 0.0714, 0.0667, 0.0625, 0.0588, 0.0556, 0.0526, 0.05, 0.0476, 0.0455, 0.0435, 0.0417, 0.04, 0.0385, 0.037, 0.0357, 0.0345, 0.0333, 0.0323, 0.0312, 0.0303, 0.0294, 0.0286, 0.0278, 0.027, 0.0263, 0.0256, 0.025, 0.0244, 0.0238, 0.0233, 0.0227, 0.0222, 0.0217, 0.0213, 0.0208, 0.0204, 0.02, 0.0196, 0.0192, 0.0189, 0.0185, 0.0182, 0.0179, 0.0175, 0.0172, 0.0169, 0.0167, 0.0164, 0.0161, 0.0159, 0.0156, 0.0154, 0.0152, 0.0149, 0.0147, 0.0145, 0.0143, 0.0141, 0.0139, 0.0137, 0.0135, 0.0133, 0.0132, 0.013, 0.0128, 0.0127, 0.0125, 0.0123, 0.0122, 0.012, 0.0119, 0.0118, 0.0116, 0.0115, 0.0114, 0.0112, 0.0111, 0.011, 0.0109, 0.0108, 0.0106, 0.0105, 0.0104, 0.0103, 0.0102, 0.0101, 0.01, 0.0099, 0.0098, 0.0097, 0.0096, 0.0095, 0.0094, 0.0093, 0.0093, 0.0092, 0.0091, 0.009, 0.0089, 0.0088, 0.0088, 0.0087, 0.0086, 0.0085, 0.0085, 0.0084, 0.0083, 0.0083, 0.0082, 0.0081, 0.0081, 0.008, 0.0079, 0.0079, 0.0078, 0.0078, 0.0077, 0.0076, 0.0076, 0.0075, 0.0075, 0.0074, 0.0074, 0.0073, 0.0072, 0.0072, 0.0071, 0.0071, 0.007, 0.007, 0.0069, 0.0069, 0.0068, 0.0068, 0.0068, 0.0067, 0.0067, 0.0066, 0.0066, 0.0065, 0.0065, 0.0065, 0.0064, 0.0064, 0.0063, 0.0063, 0.0063, 0.0062, 0.0062, 0.0061, 0.0061, 0.0061, 0.006, 0.006, 0.006, 0.0059, 0.0059, 0.0058, 0.0058, 0.0058, 0.0057, 0.0057, 0.0057, 0.0056, 0.0056, 0.0056, 0.0056, 0.0055, 0.0055, 0.0055, 0.0054, 0.0054, 0.0054, 0.0053, 0.0053, 0.0053, 0.0053, 0.0052, 0.0052, 0.0052, 0.0052, 0.0051, 0.0051, 0.0051, 0.0051, 0.005, 0.005, 0.005, 0.005, 0.0049, 0.0049, 0.0049, 0.0049, 0.0048, 0.0048, 0.0048, 0.0048, 0.0047, 0.0047, 0.0047, 0.0047, 0.0047, 0.0046, 0.0046, 0.0046, 0.0046, 0.0045, 0.0045, 0.0045, 0.0045, 0.0045, 0.0044, 0.0044, 0.0044, 0.0044, 0.0044, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0042, 0.0042, 0.0042, 0.0042, 0.0042, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0022, 0.0022, 0.0022, 0.0022, 
0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002],
                "content": "Podcast audio file"
            }
        }
    ]
}'
# For structured indexes, the custom vector field should be declared upon index creation (with type `custom_vector`).
# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512).
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

settings = {
    "type": "structured",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "all_fields": [{"name": "my_custom_vector", "type": "custom_vector"}],
    "tensor_fields": ["my_custom_vector"],
    "ann_parameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("my-first-structured-index", **settings)

# Example vectors for illustration only. Replace these with your own.
example_vector_1 = [i for i in range(512)]
example_vector_2 = [1 / (i + 1) for i in range(512)]

# We add the custom vector documents into our structured index.
# We do NOT use mappings for custom vectors here.
res = mq.index("my-first-structured-index").add_documents(
    documents=[
        {
            "_id": "doc1",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_1,
                "content": "Singing audio file",
            },
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_2,
                "content": "Podcast audio file",
            },
        },
    ]
)

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

# For structured indexes, the custom vector field should be declared upon index creation (with type `custom_vector`).
# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512). 
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-first-structured-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "type": "structured",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "allFields": [
        {"name": "my_custom_vector", "type": "custom_vector"}
    ],
    "tensorFields": ["my_custom_vector"],
    "annParameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16}
    }
}'

# We add the custom vector documents into our structured index.
# We do NOT use mappings for custom vectors here.
curl -X POST 'your_endpoint/indexes/my-first-structured-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H "Content-Type: application/json" \
-d '{
    "documents":[
        {
            "_id": "doc1",
            "my_custom_vector": {
                "vector": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479, 480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 511],
                "content": "Singing audio file"
            }
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                "vector": [1.0, 0.5, 0.3333, 0.25, 0.2, 0.1667, 0.1429, 0.125, 0.1111, 0.1, 0.0909, 0.0833, 0.0769, 0.0714, 0.0667, 0.0625, 0.0588, 0.0556, 0.0526, 0.05, 0.0476, 0.0455, 0.0435, 0.0417, 0.04, 0.0385, 0.037, 0.0357, 0.0345, 0.0333, 0.0323, 0.0312, 0.0303, 0.0294, 0.0286, 0.0278, 0.027, 0.0263, 0.0256, 0.025, 0.0244, 0.0238, 0.0233, 0.0227, 0.0222, 0.0217, 0.0213, 0.0208, 0.0204, 0.02, 0.0196, 0.0192, 0.0189, 0.0185, 0.0182, 0.0179, 0.0175, 0.0172, 0.0169, 0.0167, 0.0164, 0.0161, 0.0159, 0.0156, 0.0154, 0.0152, 0.0149, 0.0147, 0.0145, 0.0143, 0.0141, 0.0139, 0.0137, 0.0135, 0.0133, 0.0132, 0.013, 0.0128, 0.0127, 0.0125, 0.0123, 0.0122, 0.012, 0.0119, 0.0118, 0.0116, 0.0115, 0.0114, 0.0112, 0.0111, 0.011, 0.0109, 0.0108, 0.0106, 0.0105, 0.0104, 0.0103, 0.0102, 0.0101, 0.01, 0.0099, 0.0098, 0.0097, 0.0096, 0.0095, 0.0094, 0.0093, 0.0093, 0.0092, 0.0091, 0.009, 0.0089, 0.0088, 0.0088, 0.0087, 0.0086, 0.0085, 0.0085, 0.0084, 0.0083, 0.0083, 0.0082, 0.0081, 0.0081, 0.008, 0.0079, 0.0079, 0.0078, 0.0078, 0.0077, 0.0076, 0.0076, 0.0075, 0.0075, 0.0074, 0.0074, 0.0073, 0.0072, 0.0072, 0.0071, 0.0071, 0.007, 0.007, 0.0069, 0.0069, 0.0068, 0.0068, 0.0068, 0.0067, 0.0067, 0.0066, 0.0066, 0.0065, 0.0065, 0.0065, 0.0064, 0.0064, 0.0063, 0.0063, 0.0063, 0.0062, 0.0062, 0.0061, 0.0061, 0.0061, 0.006, 0.006, 0.006, 0.0059, 0.0059, 0.0058, 0.0058, 0.0058, 0.0057, 0.0057, 0.0057, 0.0056, 0.0056, 0.0056, 0.0056, 0.0055, 0.0055, 0.0055, 0.0054, 0.0054, 0.0054, 0.0053, 0.0053, 0.0053, 0.0053, 0.0052, 0.0052, 0.0052, 0.0052, 0.0051, 0.0051, 0.0051, 0.0051, 0.005, 0.005, 0.005, 0.005, 0.0049, 0.0049, 0.0049, 0.0049, 0.0048, 0.0048, 0.0048, 0.0048, 0.0047, 0.0047, 0.0047, 0.0047, 0.0047, 0.0046, 0.0046, 0.0046, 0.0046, 0.0045, 0.0045, 0.0045, 0.0045, 0.0045, 0.0044, 0.0044, 0.0044, 0.0044, 0.0044, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0043, 0.0042, 0.0042, 0.0042, 0.0042, 0.0042, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.0041, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.004, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0039, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0038, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0037, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0036, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0035, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0034, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0033, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0032, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.0031, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.003, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0029, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0028, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0027, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0026, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0025, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0024, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0023, 0.0022, 0.0022, 0.0022, 0.0022, 
0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0022, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.0021, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002, 0.002],
                "content": "Podcast audio file"
            }
        }
    ]
}'

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

# For structured indexes, the custom vector field should be declared upon index creation (with type `custom_vector`).
# Create an index with the model that has the dimensions of your custom vectors. For example: "open_clip/ViT-B-32/laion2b_s34b_b79k" (dimension is 512).
# Only the model dimension matters, as we are not vectorising anything when using custom vector fields.
# Space type CANNOT be 'prenormalized-angular' for custom vectors, as they are not normalized.

settings = {
    "type": "structured",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "all_fields": [{"name": "my_custom_vector", "type": "custom_vector"}],
    "tensor_fields": ["my_custom_vector"],
    "ann_parameters": {
        "spaceType": "angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("my-first-structured-index", **settings)

# Example vectors for illustration purposes. Replace these with your own.
example_vector_1 = [i for i in range(512)]
example_vector_2 = [1 / (i + 1) for i in range(512)]

# We add the custom vector documents into our structured index.
# We do NOT use mappings for custom vectors here.
res = mq.index("my-first-structured-index").add_documents(
    documents=[
        {
            "_id": "doc1",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_1,
                "content": "Singing audio file",
            },
        },
        {
            "_id": "doc2",
            "my_custom_vector": {
                # Put your own vector (of correct length) here.
                "vector": example_vector_2,
                "content": "Podcast audio file",
            },
        },
    ]
)

Note: Zero Magnitude Vector

When adding documents to a Marqo index, you may encounter an error related to a zero magnitude vector if normalizeEmbeddings is set to True during index creation and the custom vector provided for a document is a zero vector.

In such cases, Marqo will return the following message:

  • Error message: "Zero magnitude vector detected, cannot normalize"
  • HTTP status code: 400 Bad Request

This error applies to the specific document that could not be added because of its zero vector. To avoid it, ensure that any custom vectors you provide are non-zero when adding documents to the index.
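
As a minimal illustration (reusing the structured index created above, and assuming normalizeEmbeddings was set to True at index creation), the following document would be rejected for this reason:

# Hypothetical example: an all-zero custom vector has zero magnitude, so it
# cannot be normalized and the document is rejected.
zero_vector = [0.0] * 512  # matches the 512-dimension model used above

res = mq.index("my-first-structured-index").add_documents(
    documents=[
        {
            "_id": "zero_vector_doc",
            "my_custom_vector": {
                "vector": zero_vector,
                "content": "This document will fail to be added",
            },
        }
    ]
)

# Inspect the response for the per-document error message.
print(res)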

Example: Map Score Modifiers

Structured Index

mq = marqo.Client("http://localhost:8882", api_key=None)

settings = {
    "type": "structured",
    "vectorNumericType": "float",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "normalizeEmbeddings": True,
    "textPreprocessing": {
        "splitLength": 2,
        "splitOverlap": 0,
        "splitMethod": "sentence",
    },
    "imagePreprocessing": {"patchMethod": None},
    "allFields": [
        {"name": "text_field", "type": "text", "features": ["lexical_search"]},
        {
            "name": "map_score_mods",
            "type": "map<text, float>",
            "features": ["score_modifier"],
        },
        {
            "name": "map_score_mods_int",
            "type": "map<text, int>",
            "features": ["score_modifier"],
        },
    ],
    "tensorFields": ["text_field"],
    "annParameters": {
        "spaceType": "prenormalized-angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("map-score-modifiers-index", settings_dict=settings)

docs = [
    {"_id": "1", "text_field": "a photo of a cat", "map_score_mods": {"a": 0.5}},
    {"_id": "2", "text_field": "a photo of a dog", "map_score_mods": {"b": 0.5}},
    {"_id": "3", "text_field": "a photo of a cat", "map_score_mods": {"c": 0.5}},
    {"_id": "4", "text_field": "a photo of a cat", "map_score_mods_int": {"a": 1}},
    {"_id": "5", "text_field": "a photo of a cat", "map_score_mods_int": {"b": 1}},
    {"_id": "6", "text_field": "a photo of a cat", "map_score_mods_int": {"c": 1}},
    {
        "_id": "7",
        "text_field": "a photo of a cat",
        "map_score_mods_int": {"c": 1},
        "map_score_mods": {"a": 0.5},
    },
]

res = mq.index("map-score-modifiers-index").add_documents(documents=docs)
print(res)

For Marqo Cloud, you will need your API Key. To obtain this key visit Find Your API Key.

mq = marqo.Client("https://api.marqo.ai", api_key="XXXXXXXXXXXXXXX")

settings = {
    "type": "structured",
    "vectorNumericType": "float",
    "model": "open_clip/ViT-B-32/laion2b_s34b_b79k",
    "normalizeEmbeddings": True,
    "textPreprocessing": {
        "splitLength": 2,
        "splitOverlap": 0,
        "splitMethod": "sentence",
    },
    "imagePreprocessing": {"patchMethod": None},
    "allFields": [
        {"name": "text_field", "type": "text", "features": ["lexical_search"]},
        {
            "name": "map_score_mods",
            "type": "map<text, float>",
            "features": ["score_modifier"],
        },
        {
            "name": "map_score_mods_int",
            "type": "map<text, int>",
            "features": ["score_modifier"],
        },
    ],
    "tensorFields": ["text_field"],
    "annParameters": {
        "spaceType": "prenormalized-angular",
        "parameters": {"efConstruction": 512, "m": 16},
    },
}

mq.create_index("map-score-modifiers-index", settings_dict=settings)

docs = [
    {"_id": "1", "text_field": "a photo of a cat", "map_score_mods": {"a": 0.5}},
    {"_id": "2", "text_field": "a photo of a dog", "map_score_mods": {"b": 0.5}},
    {"_id": "3", "text_field": "a photo of a cat", "map_score_mods": {"c": 0.5}},
    {"_id": "4", "text_field": "a photo of a cat", "map_score_mods_int": {"a": 1}},
    {"_id": "5", "text_field": "a photo of a cat", "map_score_mods_int": {"b": 1}},
    {"_id": "6", "text_field": "a photo of a cat", "map_score_mods_int": {"c": 1}},
    {
        "_id": "7",
        "text_field": "a photo of a cat",
        "map_score_mods_int": {"c": 1},
        "map_score_mods": {"a": 0.5},
    },
]

res = mq.index("map-score-modifiers-index").add_documents(documents=docs)
print(res)

Unstructured Index

mq = marqo.Client("http://localhost:8882", api_key=None)

mq.create_index("my-unstructured-index", model="open_clip/ViT-B-32/laion2b_s34b_b79k")
docs = [
    {"_id": "1", "text_field": "a photo of a cat", "map_score_mods": {"a": 0.5}},
    {"_id": "2", "text_field": "a photo of a dog", "map_score_mods": {"b": 0.5}},
    {"_id": "3", "text_field": "a photo of a cat", "map_score_mods": {"c": 0.5}},
    {"_id": "4", "text_field": "a photo of a cat", "map_score_mods_int": {"a": 1}},
    {"_id": "5", "text_field": "a photo of a cat", "map_score_mods_int": {"b": 1}},
    {"_id": "6", "text_field": "a photo of a cat", "map_score_mods_int": {"c": 1}},
    {
        "_id": "7",
        "text_field": "a photo of a cat",
        "map_score_mods_int": {"c": 1},
        "map_score_mods": {"a": 0.5},
    },
    {"_id": "8", "text_field": "a photo of a dog", "my_int": 2},
]

res = mq.index("my-unstructured-index").add_documents(
    documents=docs,
    tensor_fields=["text_field"],
)
print(res)

For Marqo Cloud, you will need your API Key. To obtain this key visit Find Your API Key.

mq = marqo.Client("https://api.marqo.ai", api_key="XXXXXXXXXXXXXXX")

mq.create_index("my-unstructured-index", model="open_clip/ViT-B-32/laion2b_s34b_b79k")
docs = [
    {"_id": "1", "text_field": "a photo of a cat", "map_score_mods": {"a": 0.5}},
    {"_id": "2", "text_field": "a photo of a dog", "map_score_mods": {"b": 0.5}},
    {"_id": "3", "text_field": "a photo of a cat", "map_score_mods": {"c": 0.5}},
    {"_id": "4", "text_field": "a photo of a cat", "map_score_mods_int": {"a": 1}},
    {"_id": "5", "text_field": "a photo of a cat", "map_score_mods_int": {"b": 1}},
    {"_id": "6", "text_field": "a photo of a cat", "map_score_mods_int": {"c": 1}},
    {
        "_id": "7",
        "text_field": "a photo of a cat",
        "map_score_mods_int": {"c": 1},
        "map_score_mods": {"a": 0.5},
    },
    {"_id": "8", "text_field": "a photo of a dog", "my_int": 2},
]

res = mq.index("my-unstructured-index").add_documents(
    documents=docs,
    tensor_fields=["text_field"],
)
print(res)
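
Once indexed, these map fields can be used to modify scores at search time. The following is a hedged sketch only: it assumes individual map keys are referenced with dot notation (for example, map_score_mods.a) inside the search endpoint's score_modifiers parameter.

# Hypothetical usage sketch: boost results whose map_score_mods field contains
# the key "a", weighting its value by 2. The dot-notation field name is an
# assumption for illustration.
results = mq.index("map-score-modifiers-index").search(
    q="a photo of a cat",
    score_modifiers={
        "add_to_score": [
            {"field_name": "map_score_mods.a", "weight": 2},
        ],
    },
)
print(results)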

Media auth

Parameter: mediaDownloadHeaders

Expected value: An object that consists of key-value pair headers for media download. If set, Marqo will use these headers to authenticate the media downloads.

Default value: null

Example

mq.create_index(
    "my-first-index",
    treat_urls_and_pointers_as_images=True,
    model="open_clip/ViT-B-32/laion2b_s34b_b79k",
)
mq.index("my-first-index").add_documents(
    [
        {
            "img": "https://my-image-store.com/image_1.png",
            "title": "A lion roaming around...",
        },
        {
            "img": "https://my-image-store.com/image_2.png",
            "title": "Astronauts playing football",
        },
    ],
    media_download_headers={
        "my-image-store-api-key": "some-super-secret-image-store-key"
    },
    tensor_fields=["img", "title"],
)
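
The same headers can be supplied through the REST API with the mediaDownloadHeaders body parameter. A sketch of the equivalent request (replace your_endpoint and the index name with your own):

curl -X POST 'your_endpoint/indexes/my-first-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-Type: application/json' \
-d '{
    "documents": [
        {
            "img": "https://my-image-store.com/image_1.png",
            "title": "A lion roaming around..."
        },
        {
            "img": "https://my-image-store.com/image_2.png",
            "title": "Astronauts playing football"
        }
    ],
    "mediaDownloadHeaders": {
        "my-image-store-api-key": "some-super-secret-image-store-key"
    },
    "tensorFields": ["img", "title"]
}'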

Model auth

Parameter: modelAuth

Expected value: JSON object with either an s3 or an hf model store authorisation object.

Default value: null

The modelAuth object allows documents to be added to indexes that use OpenCLIP and CLIP models hosted in private Hugging Face and AWS S3 stores.

The modelAuth object contains either an s3 or an hf model store authorisation object. The model store authorisation object contains the credentials needed to access the index's non-public model. See the examples below for details.

The index's settings must specify the non-public model's location in the settings' modelProperties object.

modelAuth is used to download the model initially. After downloading, Marqo caches the model so that it does not need to be downloaded again.

Example: AWS S3

# Create an index that specifies the non-public location of the model.
# Note the `auth_required` field in `modelProperties` which tells Marqo to use
# the modelAuth it finds during add_documents to download the model
mq.create_index(
    index_name="my-cool-index",
    settings_dict={
        "treatUrlsAndPointersAsImages": True,
        "model": 'my_s3_model',
        "normalizeEmbeddings": True,
        "modelProperties": {
            "name": "oViT-B-32",
            "dimensions": 512,
            "model_location": {
                "s3": {
                    "Bucket": "<SOME BUCKET>",
                    "Key": "<KEY TO IDENTIFY MODEL>",
                },
                "auth_required": True
            },
            "type": "open_clip",
        }
    }
)

# Specify the authorisation needed to access the private model during add_documents:
# We recommend setting up the AWS user behind these credentials with only the
# minimal access needed to retrieve the model
mq.index("my-cool-index").add_documents(
    documents=[
        {'Title': 'The coolest moon walks'}
    ],
    model_auth={
        's3': {
            "aws_access_key_id": "<SOME ACCESS KEY ID>",
            "aws_secret_access_key": "<SOME SECRET ACCESS KEY>"
        }
    },
    tensor_fields=["Title"]
)

Example: Hugging Face (HF)

# Create an index that specifies the non-public location of the model.
# Note the `auth_required` field in `modelProperties` which tells Marqo to use
# the modelAuth it finds during add_documents to download the model
mq.create_index(
    index_name="my-cool-index",
    settings_dict={
        "treatUrlsAndPointersAsImages": True,
        "model": 'my_hf_model',
        "normalizeEmbeddings": True,
        "modelProperties": {
            "name": "ViT-B-32",
            "dimensions": 512,
            "model_location": {
                "hf": {
                    "repo_id": "<SOME HF REPO NAME>",
                    "filename": "<THE FILENAME TO DOWNLOAD>",
                },
                "auth_required": True
            },
            "type": "open_clip",
        }
    }
)

# Specify the authorisation needed to access the private model during add_documents:
mq.index("my-cool-index").add_documents(
    documents=[
        {'Title': 'The coolest moon walks'}
    ],
    tensor_fields=['Title'],
    model_auth={
        'hf': {
            "token": "<SOME HF TOKEN>",
        }
    }
)

Client batch size (Python client only)

Parameter: client_batch_size

Expected value: An Integer greater than 0 and less than or equal to 128.

Default value: None

A Python client only helper parameter that splits up very large lists of documents into batches of a more manageable size for Marqo. If you are indexing very large documents, we recommend setting this lower. A client_batch_size of 24 is a good place to start; adjust it for your use case as necessary.

Example

many_documents = [
    {"_id": f"doc_{i}", "Title": f"This is document number {i}"} for i in range(10000)
]
mq.index("my-first-index").add_documents(
    many_documents, client_batch_size=24, tensor_fields=["Title"]
)

Text Chunk Prefixes

Parameter: textChunkPrefix

Expected value: A string.

Default value: null

This field overrides the text chunk prefix set during the index's creation.

If textChunkPrefix is not specified, the prefix defined in the index settings is used. If no prefix is defined there, the model's default is used.

Note: You do not need to provide textChunkPrefix for e5 models unless you want to override the default prefixes.

Example: Adding a prefix to text document chunks when embedding. This overrides the index defaults.

curl -XPOST 'http://localhost:8882/indexes/{index_name}/documents' \
-H 'Content-type:application/json' -d '
{
    "documents": [
        {
            "Title": "The Travels of Marco Polo",
            "Description": "A 13th-century travelogue describing the travels of Polo",
            "Genre": "History"
        },
        {
            "Title": "Extravehicular Mobility Unit (EMU)",
            "Description": "The EMU is a spacesuit that provides environmental protection",
            "id": "article_591",
            "Genre": "Science"
        }
    ],
    "tensorFields": ["Description"],
    "textChunkPrefix": "override passage: "
}'

mq.index("{index_name}").add_documents(
    documents=[
        {
            "Title": "The Travels of Marco Polo",
            "Description": "A 13th-century travelogue describing the travels of Polo",
            "Genre": "History",
        },
        {
            "Title": "Extravehicular Mobility Unit (EMU)",
            "Description": "The EMU is a spacesuit that provides environmental protection",
            "id": "article_591",
            "Genre": "Science",
        },
    ],
    tensor_fields=["Description"],
    text_chunk_prefix="override passage: ",
)

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

curl -XPOST 'your_endpoint/indexes/{index_name}/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-type:application/json' -d '
{
    "documents": [
        {
            "Title": "The Travels of Marco Polo",
            "Description": "A 13th-century travelogue describing the travels of Polo",
            "Genre": "History"
        },
        {
            "Title": "Extravehicular Mobility Unit (EMU)",
            "Description": "The EMU is a spacesuit that provides environmental protection",
            "id": "article_591",
            "Genre": "Science"
        }
    ],
    "tensorFields": ["Description"],
    "textChunkPrefix": "override passage: "
}'

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint.

mq.index("{index_name}").add_documents(
    documents=[
        {
            "Title": "The Travels of Marco Polo",
            "Description": "A 13th-century travelogue describing the travels of Polo",
            "Genre": "History",
        },
        {
            "Title": "Extravehicular Mobility Unit (EMU)",
            "Description": "The EMU is a spacesuit that provides environmental protection",
            "id": "article_591",
            "Genre": "Science",
        },
    ],
    tensor_fields=["Description"],
    text_chunk_prefix="override passage: ",
)

Adding Audio and Video documents

If you would like to index audio and/or video documents, you can do so using the LanguageBind models here. Make sure to set treatUrlsAndPointersAsMedia to true in the index settings for unstructured indexes. To run this example, please ensure that your system has at least 16GB of memory.

Unstructured Index Example

mq.create_index(
    "my-index",
    treat_urls_and_pointers_as_media=True,
    model="LanguageBind/Video_V1.5_FT_Audio_FT_Image",
)

documents = [
    {
        "_id": "video",
        "video": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4",
    },
    {
        "_id": "audio",
        "audio": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav",
    },
    {
        "_id": "image",
        "image": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png",
    },
]

mq.index("my-index").add_documents(documents, tensor_fields=["video", "audio", "image"])
curl -X POST 'http://localhost:8882/indexes/my-index' \
-H 'Content-Type: application/json' \
-d '{
    "treatUrlsAndPointersAsMedia": true,
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image"
}'

curl -X POST 'http://localhost:8882/indexes/my-index/documents' \
-H 'Content-Type: application/json' \
-d '{
    "documents": [
        {
            "_id": "video",
            "video": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4"
        },
        {
            "_id": "audio",
            "audio": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav"
        },
        {
            "_id": "image",
            "image": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png"
        }
    ],
    "tensorFields": ["video", "audio", "image"]
}'

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

mq = marqo.Client("https://api.marqo.ai", api_key="XXXXXXXXXXXXXXX")

mq.create_index(
    "my-index",
    treat_urls_and_pointers_as_media=True,
    model="LanguageBind/Video_V1.5_FT_Audio_FT_Image",
)

documents = [
    {
        "_id": "video",
        "video": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4",
    },
    {
        "_id": "audio",
        "audio": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav",
    },
    {
        "_id": "image",
        "image": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png",
    },
]

mq.index("my-index").add_documents(documents, tensor_fields=["video", "audio", "image"])
curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-Type: application/json' \
-d '{
    "treatUrlsAndPointersAsMedia": true,
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image"
}'

curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-Type: application/json' \
-d '{
    "documents": [
        {
            "_id": "video",
            "video": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4"
        },
        {
            "_id": "audio",
            "audio": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav"
        },
        {
            "_id": "image",
            "image": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png"
        }
    ],
    "tensorFields": ["video", "audio", "image"]
}'

Structured Index Example

settings = {
    "type": "structured",
    "vectorNumericType": "float",
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image",
    "normalizeEmbeddings": False,
    "tensorFields": ["video_field", "audio_field", "image_field", "multimodal_field"],
    "allFields": [
        {"name": "video_field", "type": "video_pointer"},
        {"name": "audio_field", "type": "audio_pointer"},
        {"name": "image_field", "type": "image_pointer"},
        {
            "name": "multimodal_field",
            "type": "multimodal_combination",
            "dependentFields": {
                "image_field": 0.3,
                "video_field": 0.4,
                "audio_field": 0.3,
            },
        },
    ],
}

mq.create_index("my-index", settings_dict=settings)

documents = [
    {
        "_id": "video",
        "video_field": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4",
    },
    {
        "_id": "audio",
        "audio_field": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav",
    },
    {
        "_id": "image",
        "image_field": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png",
    },
]

mq.index("my-index").add_documents(documents)
curl -X POST 'http://localhost:8882/indexes/my-index' \
-H 'Content-Type: application/json' \
-d '{
    "type": "structured",
    "vectorNumericType": "float",
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image",
    "normalizeEmbeddings": false,
    "tensorFields": ["video_field", "audio_field", "image_field", "multimodal_field"],
    "allFields": [
        {"name": "video_field", "type": "video_pointer"},
        {"name": "audio_field", "type": "audio_pointer"},
        {"name": "image_field", "type": "image_pointer"},
        {
            "name": "multimodal_field",
            "type": "multimodal_combination",
            "dependentFields": {
                "image_field": 0.3,
                "video_field": 0.4,
                "audio_field": 0.3
            }
        }
    ]
}'

curl -X POST 'http://localhost:8882/indexes/my-index/documents' \
-H 'Content-Type: application/json' \
-d '{
    "documents": [
        {
            "_id": "video",
            "video_field": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4"
        },
        {
            "_id": "audio",
            "audio_field": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav"
        },
        {
            "_id": "image",
            "image_field": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png"
        }
    ]
}'

For Marqo Cloud, you will need to access the endpoint of your index and replace your_endpoint with this. To do this, visit Find Your Endpoint. You will also need your API Key. To obtain this key visit Find Your API Key.

mq = marqo.Client("https://api.marqo.ai", api_key="XXXXXXXXXXXXXXX")

settings = {
    "type": "structured",
    "vectorNumericType": "float",
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image",
    "normalizeEmbeddings": False,
    "tensorFields": ["video_field", "audio_field", "image_field", "multimodal_field"],
    "allFields": [
        {"name": "video_field", "type": "video_pointer"},
        {"name": "audio_field", "type": "audio_pointer"},
        {"name": "image_field", "type": "image_pointer"},
        {
            "name": "multimodal_field",
            "type": "multimodal_combination",
            "dependentFields": {
                "image_field": 0.3,
                "video_field": 0.4,
                "audio_field": 0.3,
            },
        },
    ],
}

mq.create_index("my-index", settings_dict=settings)

documents = [
    {
        "_id": "video",
        "video_field": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4",
    },
    {
        "_id": "audio",
        "audio_field": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav",
    },
    {
        "_id": "image",
        "image_field": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png",
    },
]

mq.index("my-index").add_documents(documents)
curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-index' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-Type: application/json' \
-d '{
    "type": "structured",
    "vectorNumericType": "float",
    "model": "LanguageBind/Video_V1.5_FT_Audio_FT_Image",
    "normalizeEmbeddings": false,
    "tensorFields": ["video_field", "audio_field", "image_field", "multimodal_field"],
    "allFields": [
        {"name": "video_field", "type": "video_pointer"},
        {"name": "audio_field", "type": "audio_pointer"},
        {"name": "image_field", "type": "image_pointer"},
        {
            "name": "multimodal_field",
            "type": "multimodal_combination",
            "dependentFields": {
                "image_field": 0.3,
                "video_field": 0.4,
                "audio_field": 0.3
            }
        }
    ]
}'

curl -X POST 'https://api.marqo.ai/api/v2/indexes/my-index/documents' \
-H 'x-api-key: XXXXXXXXXXXXXXX' \
-H 'Content-Type: application/json' \
-d '{
    "documents": [
        {
            "_id": "video",
            "video_field": "https://marqo-k400-video-test-dataset.s3.amazonaws.com/videos/---QUuC4vJs_000084_000094.mp4"
        },
        {
            "_id": "audio",
            "audio_field": "https://marqo-ecs-50-audio-test-dataset.s3.amazonaws.com/audios/4-145081-A-9.wav"
        },
        {
            "_id": "image",
            "image_field": "https://raw.githubusercontent.com/marqo-ai/marqo-api-tests/mainline/assets/ai_hippo_realistic.png"
        }
    ]
}'