Parse nested JSON in Logic Apps
In this post, I will give a brief walkthrough of handling nested JSON data and extracting its contents.
My sample file is as follows:
{
  "id": "0001",
  "type": "donut",
  "name": "Cake",
  "ppu": 0.55,
  "batters": {
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate" },
      { "id": "1003", "type": "Blueberry" },
      { "id": "1004", "type": "Devil's Food" }
    ]
  },
  "topping": [
    { "id": "5001", "type": "None" },
    { "id": "5002", "type": "Glazed" },
    { "id": "5005", "type": "Sugar" },
    { "id": "5007", "type": "Powdered Sugar" },
    { "id": "5006", "type": "Chocolate with Sprinkles" },
    { "id": "5003", "type": "Chocolate" },
    { "id": "5004", "type": "Maple" }
  ]
}
If you look closely, the file has nested content. Here, we parse the JSON, read the values in the 'topping' section, and create a file for each topping.
Create a new logic app. I begin by adding a 'Recurrence' trigger.
The above file can either be saved to a location or its contents can be initialized in a variable. I opted for the latter and created a variable of type 'Object'.
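In the code view, the 'Initialize variable' action looks roughly like this (the action and variable names are my own, and the value is the sample file from above, abbreviated):

```json
{
  "Initialize_variable": {
    "type": "InitializeVariable",
    "inputs": {
      "variables": [
        {
          "name": "SampleJson",
          "type": "object",
          "value": {
            "id": "0001",
            "type": "donut",
            "name": "Cake",
            "topping": [
              { "id": "5001", "type": "None" }
            ]
          }
        }
      ]
    }
  }
}
```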

Add the next action, 'Parse JSON'. For the schema, we can supply the above file as a sample payload, which generates the schema for us.
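For reference, the schema generated from the sample payload looks roughly like this (trimmed here to the scalar fields and the 'topping' section; the actual generated schema also covers 'batters'):

```json
{
  "type": "object",
  "properties": {
    "id": { "type": "string" },
    "type": { "type": "string" },
    "name": { "type": "string" },
    "ppu": { "type": "number" },
    "topping": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "id": { "type": "string" },
          "type": { "type": "string" }
        },
        "required": ["id", "type"]
      }
    }
  }
}
```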

As our plan is to generate a file for each topping, use a 'For each' loop that iterates through the section of the file we select.

If we revisit the file, we can observe there are two nested sections in the main JSON: 'batter' (inside 'batters') and 'topping'. Let us select 'topping'.
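In the code view, selecting 'topping' translates to an expression like the following (the action name 'Parse_JSON' is whatever you named the Parse JSON step; the loop body is trimmed):

```json
{
  "For_each": {
    "type": "Foreach",
    "foreach": "@body('Parse_JSON')?['topping']",
    "actions": {}
  }
}
```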
The next step is to create the file in blob storage with a proper name.
As we have selected 'topping', the options displayed for the 'blob name' will correspond to the contents of this section. But we cannot assign a value directly here, since each iteration holds an individual topping. So we need to make use of the expression 'item()' to read the id values.

The content of the file will be the current item in the loop.
Now, on executing the logic app, we can observe that the 'For each' runs 7 iterations, which matches the toppings count in the file.
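Putting the name and content together, a sketch of the 'Create blob' action inside the loop could look like this (the folder path, the '.json' extension, and the exact connector shape are illustrative assumptions; 'item()' is the current topping):

```json
{
  "Create_blob": {
    "type": "ApiConnection",
    "inputs": {
      "path": "/datasets/default/files",
      "queries": {
        "folderPath": "/toppings",
        "name": "@{concat(item()?['id'], '.json')}"
      },
      "body": "@item()"
    }
  }
}
```

Here `item()?['id']` resolves to values such as "5001", so each topping lands in its own blob.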


This was originally posted here.