I have only one large JSON file. For example,
{
"Name": "Motor_M23",
"AASID": {
"IDType": "URI",
"IDSpec": "http://acplt.org/AAS/Motor_M23"
},
"AssetID": {
"IDType": "URI",
"IDSpec": "http://acplt.org/Assets/Motor_M23"
},
"Header": {
"PropertyValueStatementContainers": [
{
"Name": "Config",
      ...
I need to support the following operations:
Querying for an element should return all of its child elements, e.g. querying for AssetID should return
"AssetID": {
"IDType": "URI",
"IDSpec": "http://acplt.org/Assets/Motor_M23"
}
Updating the value of an element, e.g. AssetID or a child element of AASID.
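To make the requirements concrete, here is a minimal sketch of the two operations over the raw JSON, using only the standard library. The document structure is taken from the example above; the recursive `query` helper and the `update` path convention are my own illustration, not part of any particular database API.

```python
import json

# Sample document with the structure shown above (truncated fields omitted).
doc = json.loads("""
{
  "Name": "Motor_M23",
  "AASID":  {"IDType": "URI", "IDSpec": "http://acplt.org/AAS/Motor_M23"},
  "AssetID": {"IDType": "URI", "IDSpec": "http://acplt.org/Assets/Motor_M23"}
}
""")

def query(node, key):
    """Depth-first search: return the element stored under `key`
    together with all of its child elements, or None if absent."""
    if isinstance(node, dict):
        if key in node:
            return node[key]
        for child in node.values():
            found = query(child, key)
            if found is not None:
                return found
    elif isinstance(node, list):
        for child in node:
            found = query(child, key)
            if found is not None:
                return found
    return None

def update(node, path, value):
    """Set a nested value given a key path, e.g. ["AASID", "IDSpec"]."""
    for key in path[:-1]:
        node = node[key]
    node[path[-1]] = value

# Querying for AssetID returns the element with both children.
asset_id = query(doc, "AssetID")

# Updating a child element of AASID.
update(doc, ["AASID", "IDSpec"], "http://acplt.org/AAS/Motor_M24")
```

This works for a single in-memory document, but for a large file it means parsing everything on every load, which is exactly what a database would avoid.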
I considered the following approaches:
Is there a good database out there that can load data from a large JSON file and handle my operations?
If you are only working with JSON, then you should really use a document-oriented database, as it will save you from having to wrestle with something SQL-related.
MongoDB is a good choice: it supports many drivers and can deal with tree structures (though I'm not sure about automatic creation).
CRUD operations are simple and cover a wide range of cases.
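As a hedged sketch of how the two operations from the question could map onto MongoDB: the element query becomes a projection, and the nested update uses dot notation. The filter, projection, and update documents below are plain dictionaries and need no server; the `run` function at the end shows how they would be passed to a pymongo collection (the database/collection names are assumptions).

```python
# Filter: find the document for this asset by name.
query_filter = {"Name": "Motor_M23"}

# Projection: return only the AssetID subtree, i.e. the element
# with all of its child elements.
projection = {"AssetID": 1, "_id": 0}

# Update: change a child element of AASID using dot notation.
update_doc = {"$set": {"AASID.IDSpec": "http://acplt.org/AAS/Motor_M24"}}

def run(collection):
    """Apply the query and update against a pymongo collection."""
    asset_id = collection.find_one(query_filter, projection)
    collection.update_one(query_filter, update_doc)
    return asset_id

# With pymongo installed and a local mongod running, usage would be roughly:
#   from pymongo import MongoClient
#   coll = MongoClient()["aas"]["shells"]   # names are assumptions
#   run(coll)
```

The dot-notation update is the key point: you can modify one nested field without rewriting the whole document, which is what "update a child element" needs.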
For very large datasets on busy servers, you should use the XFS file system and the WiredTiger storage engine, as both bring some performance gains.
It's well supported, and the learning curve isn't steep (I came from pure SQL without too much trouble).
You also have the option of MariaDB or MySQL, both of which support JSON, though I have no experience with either; in the case of MySQL, the JSON support feels like a 'bolt-on' added to meet an upcoming requirement.