Support for JQ-based JSON transformations during import of JSON data


      Summary:
      It would be great if Insight allowed transforming the JSON data of the source to better match the intended structure within Insight. JQ (https://stedolan.github.io/jq/) and the Java-based open-source implementation Jackson-JQ (https://github.com/eiiches/jackson-jq) do just that.

      Use case / current workaround:
      I'd like to give you some more information about how I'm currently importing data from JSON-based REST APIs into Insight by converting it to CSV. Maybe you can make that process available within Insight, without having to pre-process the JSON data outside of Insight. I'm using JQ (https://stedolan.github.io/jq/) to transform the JSON into a simple list of objects with attributes, and then I use json2csv (http://www.mircozeiss.com/json2csv/) to convert this list into CSV.

      *Here's a simple example:* Given the following JSON structure (as example):

      {
          "count": 94,
          "next": null,
          "previous": null,
          "results": [
              {
                  "subject": "...",
                  "subject_url": "...",
                  "subject_type": "Netbox",
                  "on_maintenance": false,
                  "acknowledgement": null,
                  "event_history_url": "...",
                  "netbox_history_url": "...",
                  "device_groups": [],
                  "alert_type": {
                      "name": "boxDown",
                      "description": "Box declared down."
                  },
                  "event_type": {
                      "id": "boxState",
                      "description": "Tells us whether a network-unit is down or up."
                  },
                  "start_time": "...",
                  "end_time": "...",
                  "id": 12461,
                  "source": "pping",
                  "device": null,
                  "netbox": 187,
                  "subid": "",
                  "value": 1,
                  "severity": 50
              },
              {
                  ...
              }
          ]
      }

      I use the following command line to transform it into a CSV file that can be easily imported. It works very well for auto-creating the object schema and the import object type mapping, which reduces manual work a lot:

      curl -s <url> | jq '[.results[] | {id,subject,subject_url,subject_type,...}] | unique_by(.id)' | json2csv > data.csv

      That produces the following intermediate JSON:

      [
        {
          "id": 146,
          "subject": "...",
          "subject_url": "...",
          "subject_type": "Netbox",
          ...
        },
        {
          ...
        }
      ]

      and in turn generates the following CSV:

      "id","subject","subject_url","subject_type",...
      146,"...","...","Netbox",...
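
      For illustration, the transform that the jq + json2csv pipeline performs (project each result onto a few fields, de-duplicate by id, emit CSV) can be sketched in plain Python using only the standard library. The field list and function name here are illustrative, not part of any existing tool:

      ```python
      import csv
      import json

      # Illustrative subset of the fields selected in the jq filter
      FIELDS = ["id", "subject", "subject_url", "subject_type"]

      def results_to_csv(raw_json: str, csv_path: str) -> None:
          """Approximate `jq '[.results[] | {...}] | unique_by(.id)' | json2csv`."""
          data = json.loads(raw_json)
          # Project each result object onto the wanted fields
          rows = [{f: r.get(f) for f in FIELDS} for r in data["results"]]
          # unique_by(.id): keep one row per id, sorted by id like jq does
          unique = {row["id"]: row for row in rows}
          rows = [unique[key] for key in sorted(unique)]
          with open(csv_path, "w", newline="") as fh:
              # QUOTE_NONNUMERIC quotes strings but leaves numbers bare,
              # matching the CSV shape shown above
              writer = csv.DictWriter(fh, fieldnames=FIELDS,
                                      quoting=csv.QUOTE_NONNUMERIC)
              writer.writeheader()
              writer.writerows(rows)
      ```

      This is only a sketch of the intermediate step; the point of the request is that Insight could apply such a JQ transform internally instead.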

      Suggested implementation:

      Since it does not make sense for a JIRA plugin to depend on command-line tools, a Java-based solution like Jackson-JQ (https://github.com/eiiches/jackson-jq) could be used instead. That way you get the full expressiveness of JQ for transforming complex data into a simple table, which can then be imported using automated creation of the schema and import mapping.

            Assignee:
            Unassigned
            Reporter:
            Alex Cooksey (Inactive)
            Votes:
            0
            Watchers:
            0
