Elasticsearch Delete By Query API: Curl is working but failing to achieve same using Python (requests)?


I am using the Delete By Query API to delete a bunch of documents. The curl request below works perfectly:

POST /tom-access/doc/_delete_by_query
{
  "query": {
    "terms": {
      "_id": [
        "xxxxx",
        "yyyyy"
      ]
    }
  }
}

Now I want to use the requests library in Python to achieve the same thing.

import requests,json

url = "http://elastic.tool.com:80/tom-access/doc/_delete_by_query"
headers = {"Content-type": "application/json", "Accept": "application/json", "Authorization": "Basic asdadsasdasdasd"}

data = {
        'query':{
                'terms':{
                        '_id':[
                                'xxxxx',
                                'yyyyy'
                        ]
                }
        }
}

try:
    r = requests.post(url,
                      headers=headers,
                      data=data,
                      verify=False)
except requests.exceptions.RequestException as e:
    raise SystemExit(e)

response_dict = r.json()
print(response_dict)

I am getting the error below:

{'error': {'root_cause': [{'type': 'json_parse_exception', 'reason': "Unrecognized token 'query': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@bc04803; line: 1, column: 7]"}], 'type': 'json_parse_exception', 'reason': "Unrecognized token 'query': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@bc04803; line: 1, column: 7]"}, 'status': 500}

What am I doing wrong?

The quote style (single vs. double) doesn't matter in Python, but you do need to serialize the query to a JSON string with json.dumps() before sending it. Here is an example from https://marcobonzanini.com/2015/02/02/how-to-query-elasticsearch-with-python/ that shows the use of the requests library:

import json
import requests

def search(uri, term):
    """Simple Elasticsearch Query"""
    query = json.dumps({
        "query": {
            "match": {
                "content": term
            }
        }
    })
    response = requests.get(uri, data=query)
    results = json.loads(response.text)
    return results

There is also the official Elasticsearch client for Python, elasticsearch-py.
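For completeness, a minimal sketch with elasticsearch-py might look like the following. This assumes a pre-8.x client (where http_auth and a dict body are accepted); the user/password values are placeholders, not credentials from the question:

from elasticsearch import Elasticsearch

# Placeholder host and credentials -- replace with your own
es = Elasticsearch(
    ["http://elastic.tool.com:80"],
    http_auth=("user", "password"),
)

body = {
    "query": {
        "terms": {
            "_id": ["xxxxx", "yyyyy"]
        }
    }
}

# The client serializes the body to JSON for you
response = es.delete_by_query(index="tom-access", body=body)
print(response.get("deleted"))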


You need to change the way you are firing the request from Python.

So instead of:

r = requests.post(url,
                 headers=headers,
                 data=data,
                 verify=False)

try using:

r = requests.post(url,
                 headers=headers,
                 data=json.dumps(data),
                 verify=False)


Elasticsearch is complaining that you are not passing your data structure in JSON format, so you need to dump it to JSON first. The Python requests library also has a shortcut for this, so you do not need to dump your variable to JSON yourself with:

r = requests.post(url,
                 headers=headers,
                 data=json.dumps(data),
                 verify=False)

Instead, you can just use the json=data option like this:

r = requests.post(url,
                 headers=headers,
                 json=data,
                 verify=False)
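Putting it together, an end-to-end version of the original snippet could look like this sketch. The URL and Authorization value are the question's own placeholders, and verify=False is kept only because the original used it:

import requests

url = "http://elastic.tool.com:80/tom-access/doc/_delete_by_query"
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json",
    "Authorization": "Basic asdadsasdasdasd",  # placeholder from the question
}

data = {
    "query": {
        "terms": {
            "_id": ["xxxxx", "yyyyy"]
        }
    }
}

try:
    # json=data makes requests serialize the body and set the Content-Type header
    r = requests.post(url, headers=headers, json=data, verify=False)
    r.raise_for_status()
except requests.exceptions.RequestException as e:
    raise SystemExit(e)

response_dict = r.json()
# A successful _delete_by_query response includes a "deleted" count
print(response_dict.get("deleted"))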


Comments
  • Isn't the URL wrong?
  • I messed up when copying. That is not the issue. Updated the question.