Commit d602c5f

feat(specs): add global push endpoint (generated)
algolia/api-clients-automation#4855

Co-authored-by: algolia-bot <accounts+algolia-api-client-bot@algolia.com>
Co-authored-by: Clément Vannicatte <vannicattec@gmail.com>
1 parent 6b42669 commit d602c5f

File tree

2 files changed: +64 -16 lines changed

lib/algolia/api/ingestion_client.rb

Lines changed: 63 additions & 4 deletions
@@ -2138,14 +2138,73 @@ def list_transformations(items_per_page = nil, page = nil, sort = nil, order = n
       )
     end
 
-    # Push a `batch` request payload through the Pipeline. You can check the status of task pushes with the observability endpoints.
+    # Pushes records through the Pipeline, directly to an index. You can make the call synchronous by providing the `watch` parameter; for asynchronous calls, you can use the observability endpoints and/or debugger dashboard to see the status of your task. If you want to leverage the [pre-indexing data transformation](https://www.algolia.com/doc/guides/sending-and-managing-data/send-and-update-your-data/how-to/transform-your-data/), this is the recommended way of ingesting your records. This method is similar to `pushTask`, but requires an `indexName` instead of a `taskID`. If zero or many tasks are found, an error will be returned.
+    #
+    # Required API Key ACLs:
+    #   - addObject
+    #   - deleteIndex
+    #   - editSettings
+    # @param index_name [String] Name of the index on which to perform the operation. (required)
+    # @param push_task_payload [PushTaskPayload] (required)
+    # @param watch [Boolean] When provided, the push operation will be synchronous and the API will wait for the ingestion to be finished before responding.
+    # @param request_options: The request options to send along with the query, they will be merged with the transporter base parameters (headers, query params, timeouts, etc.). (optional)
+    # @return [Http::Response] the response
+    def push_with_http_info(index_name, push_task_payload, watch = nil, request_options = {})
+      # verify the required parameter 'index_name' is set
+      if @api_client.config.client_side_validation && index_name.nil?
+        raise ArgumentError, "Parameter `index_name` is required when calling `push`."
+      end
+      # verify the required parameter 'push_task_payload' is set
+      if @api_client.config.client_side_validation && push_task_payload.nil?
+        raise ArgumentError, "Parameter `push_task_payload` is required when calling `push`."
+      end
+
+      path = "/1/push/{indexName}".sub("{" + "indexName" + "}", Transport.encode_uri(index_name.to_s))
+      query_params = {}
+      query_params[:watch] = watch unless watch.nil?
+      query_params = query_params.merge(request_options[:query_params]) unless request_options[:query_params].nil?
+      header_params = {}
+      header_params = header_params.merge(request_options[:header_params]) unless request_options[:header_params].nil?
+      request_options[:timeout] ||= 180000
+      request_options[:connect_timeout] ||= 180000
+
+      post_body = request_options[:debug_body] || @api_client.object_to_http_body(push_task_payload)
+
+      new_options = request_options.merge(
+        :operation => :"IngestionClient.push",
+        :header_params => header_params,
+        :query_params => query_params,
+        :body => post_body,
+        :use_read_transporter => false
+      )
+
+      @api_client.call_api(:POST, path, new_options)
+    end
+
+    # Pushes records through the Pipeline, directly to an index. You can make the call synchronous by providing the `watch` parameter; for asynchronous calls, you can use the observability endpoints and/or debugger dashboard to see the status of your task. If you want to leverage the [pre-indexing data transformation](https://www.algolia.com/doc/guides/sending-and-managing-data/send-and-update-your-data/how-to/transform-your-data/), this is the recommended way of ingesting your records. This method is similar to `pushTask`, but requires an `indexName` instead of a `taskID`. If zero or many tasks are found, an error will be returned.
+    #
+    # Required API Key ACLs:
+    #   - addObject
+    #   - deleteIndex
+    #   - editSettings
+    # @param index_name [String] Name of the index on which to perform the operation. (required)
+    # @param push_task_payload [PushTaskPayload] (required)
+    # @param watch [Boolean] When provided, the push operation will be synchronous and the API will wait for the ingestion to be finished before responding.
+    # @param request_options: The request options to send along with the query, they will be merged with the transporter base parameters (headers, query params, timeouts, etc.). (optional)
+    # @return [WatchResponse]
+    def push(index_name, push_task_payload, watch = nil, request_options = {})
+      response = push_with_http_info(index_name, push_task_payload, watch, request_options)
+      @api_client.deserialize(response.body, request_options[:debug_return_type] || "Ingestion::WatchResponse")
+    end
+
+    # Pushes records through the Pipeline, directly to an index. You can make the call synchronous by providing the `watch` parameter; for asynchronous calls, you can use the observability endpoints and/or debugger dashboard to see the status of your task. If you want to leverage the [pre-indexing data transformation](https://www.algolia.com/doc/guides/sending-and-managing-data/send-and-update-your-data/how-to/transform-your-data/), this is the recommended way of ingesting your records. This method is similar to `push`, but requires a `taskID` instead of an `indexName`, which is useful when many `destinations` target the same `indexName`.
     #
     # Required API Key ACLs:
     #   - addObject
     #   - deleteIndex
     #   - editSettings
     # @param task_id [String] Unique identifier of a task. (required)
-    # @param push_task_payload [PushTaskPayload] Request body of a Search API `batch` request that will be pushed in the Connectors pipeline. (required)
+    # @param push_task_payload [PushTaskPayload] (required)
     # @param watch [Boolean] When provided, the push operation will be synchronous and the API will wait for the ingestion to be finished before responding.
     # @param request_options: The request options to send along with the query, they will be merged with the transporter base parameters (headers, query params, timeouts, etc.). (optional)
     # @return [Http::Response] the response
@@ -2181,14 +2240,14 @@ def push_task_with_http_info(task_id, push_task_payload, watch = nil, request_op
       @api_client.call_api(:POST, path, new_options)
     end
 
-    # Push a `batch` request payload through the Pipeline. You can check the status of task pushes with the observability endpoints.
+    # Pushes records through the Pipeline, directly to an index. You can make the call synchronous by providing the `watch` parameter; for asynchronous calls, you can use the observability endpoints and/or debugger dashboard to see the status of your task. If you want to leverage the [pre-indexing data transformation](https://www.algolia.com/doc/guides/sending-and-managing-data/send-and-update-your-data/how-to/transform-your-data/), this is the recommended way of ingesting your records. This method is similar to `push`, but requires a `taskID` instead of an `indexName`, which is useful when many `destinations` target the same `indexName`.
     #
     # Required API Key ACLs:
     #   - addObject
     #   - deleteIndex
     #   - editSettings
     # @param task_id [String] Unique identifier of a task. (required)
-    # @param push_task_payload [PushTaskPayload] Request body of a Search API `batch` request that will be pushed in the Connectors pipeline. (required)
+    # @param push_task_payload [PushTaskPayload] (required)
     # @param watch [Boolean] When provided, the push operation will be synchronous and the API will wait for the ingestion to be finished before responding.
     # @param request_options: The request options to send along with the query, they will be merged with the transporter base parameters (headers, query params, timeouts, etc.). (optional)
     # @return [WatchResponse]
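
For illustration, a minimal usage sketch of the new index-level `push` method added in this file. It assumes the generated Ruby client's usual bootstrap (`Algolia::IngestionClient.create(app_id, api_key, region)`) and that `PushTaskPayload` accepts an attribute hash with plain-hash records; the credentials, region, index name, and record fields below are placeholders, not values from this commit.

require "algolia"

# Placeholder credentials and region for the regional Ingestion API.
client = Algolia::IngestionClient.create("YOUR_APP_ID", "YOUR_API_KEY", "us")

# Build a payload using one of the actions that remain valid after this change
# (see lib/algolia/models/ingestion/action.rb below).
payload = Algolia::Ingestion::PushTaskPayload.new(
  action: "addObject",
  records: [
    {objectID: "1", name: "Example record"} # illustrative record shape
  ]
)

# Passing watch = true makes the call synchronous; the generated method applies 180s timeouts.
response = client.push("YOUR_INDEX_NAME", payload, true)
puts response.to_hash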

lib/algolia/models/ingestion/action.rb

Lines changed: 1 addition & 12 deletions
@@ -12,20 +12,9 @@ class Action
       UPDATE_OBJECT = "updateObject".freeze
       PARTIAL_UPDATE_OBJECT = "partialUpdateObject".freeze
       PARTIAL_UPDATE_OBJECT_NO_CREATE = "partialUpdateObjectNoCreate".freeze
-      DELETE_OBJECT = "deleteObject".freeze
-      DELETE = "delete".freeze
-      CLEAR = "clear".freeze
 
       def self.all_vars
-        @all_vars ||= [
-          ADD_OBJECT,
-          UPDATE_OBJECT,
-          PARTIAL_UPDATE_OBJECT,
-          PARTIAL_UPDATE_OBJECT_NO_CREATE,
-          DELETE_OBJECT,
-          DELETE,
-          CLEAR
-        ].freeze
+        @all_vars ||= [ADD_OBJECT, UPDATE_OBJECT, PARTIAL_UPDATE_OBJECT, PARTIAL_UPDATE_OBJECT_NO_CREATE].freeze
       end
 
       # Builds the enum from string
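
With `deleteObject`, `delete`, and `clear` removed from this enum, only the four remaining actions are accepted in push payloads. A small guard sketch using the `all_vars` list shown above, assuming the model is namespaced under `Algolia::Ingestion` as the file path suggests; the payload line reuses the illustrative shape from the earlier sketch.

# Valid actions after this change: addObject, updateObject,
# partialUpdateObject, partialUpdateObjectNoCreate.
action = "partialUpdateObject"

unless Algolia::Ingestion::Action.all_vars.include?(action)
  raise ArgumentError, "#{action} is not a supported push action"
end

payload = Algolia::Ingestion::PushTaskPayload.new(action: action, records: [{objectID: "1"}])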
