I have to store some messages in Elasticsearch, integrated with my Python program. Right now, what I do to store a message is:
d={"message":"this is message"}
for index_nr in range(1,5):
ElasticSearchAPI.addToIndex(index_nr, d)
print d
That means if I have 10 messages, I have to repeat my code 10 times. So what I want to do instead is make a script file or batch file. I've checked the Elasticsearch Guide, and the bulk API can be used. The format should be something like this:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "delete" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "create" : { "_index" : "test", "_type" : "type1", "_id" : "3" } }
{ "field1" : "value3" }
{ "update" : {"_id" : "1", "_type" : "type1", "_index" : "index1"} }
{ "doc" : {"field2" : "value2"} }
What I did is:
{"index":{"_index":"test1","_type":"message","_id":"1"}}
{"message":"it is red"}
{"index":{"_index":"test2","_type":"message","_id":"2"}}
{"message":"it is green"}
I also use the curl tool to store the docs:
$ curl -s -XPOST localhost:9200/_bulk --data-binary @message.json
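I suppose the same request could also be sent from Python with the requests library (a sketch, assuming message.json is the bulk-formatted file shown above):

import requests

# Sketch: POST the bulk-formatted file to the _bulk endpoint, like the curl call above.
with open("message.json", "rb") as f:
    response = requests.post(
        "http://localhost:9200/_bulk",
        data=f.read(),
        headers={"Content-Type": "application/x-ndjson"},
    )
print(response.json())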
Now I want to use my Python code to store the file in Elasticsearch. What I have so far is:
from datetime import datetime
from elasticsearch import Elasticsearch
from elasticsearch import helpers

es = Elasticsearch()

# one action per document; helpers.bulk sends them in a single bulk request
actions = [
    {
        "_index": "tickets-index",
        "_type": "tickets",
        "_id": j,
        "_source": {
            "any": "data" + str(j),
            "timestamp": datetime.now(),
        },
    }
    for j in range(0, 10)
]

helpers.bulk(es, actions)
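This builds the actions in Python, but it doesn't use my message.json file at all. I guess the low-level client might be able to take the file contents directly (a rough, untested sketch):

from elasticsearch import Elasticsearch

es = Elasticsearch()

# Rough, untested sketch: pass the newline-delimited bulk file straight to the
# low-level bulk call instead of building the actions list by hand.
with open("message.json") as f:
    es.bulk(body=f.read())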