Batch transformation columns offset or lost

Hello!

I have several .json files and I would like to extract some columns. When I filter an individual file with "Remove columns", everything looks fine, but in batch processing the results "drift" and are unusable.

Let's say I want to extract the "contentId" and "contents.0.medias.0.src.file" fields from the attached files. It seems like a simple task, but the batch does not work.
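For reference, here is a minimal Python sketch of the same extraction, assuming the flattened path "contents.0.medias.0.src.file" means `contents[0].medias[0].src.file` inside each JSON document (the sample record below is made up for illustration):

```python
import json

def extract(record):
    """Pull 'contentId' and contents[0].medias[0].src.file from one parsed JSON record."""
    content_id = record.get("contentId")
    try:
        src_file = record["contents"][0]["medias"][0]["src"]["file"]
    except (KeyError, IndexError, TypeError):
        src_file = None  # field missing, or shaped differently, in this file
    return content_id, src_file

# One record per .json file; in a real batch you would json.load() each file.
sample = json.loads('{"contentId": 42, "contents": [{"medias": [{"src": {"file": "a.mp3"}}]}]}')
row = extract(sample)
```

Returning `None` for missing fields keeps every output row the same width, which is exactly the property batch processing needs.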

The error is probably mine, please help!

Also, where could I upload the sample data?

It seems I'm not allowed to do that…

Hi @Madis

Batch processing relies on the columns being the same in every input file.

You may be able to achieve this by changing the Format option for the JSON input from Wide (more columns) to Long (more rows).

If that doesn’t work you will need to use a Stack transform to put all the columns into a pre-defined order:
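The idea behind a Stack-style fix can be sketched in plain Python (the column list below is hypothetical): every row is re-emitted against one fixed header, so files whose keys arrive in a different order, or with extras, still line up.

```python
# Fixed header: every output row follows this order, regardless of the
# key order (or extra keys) in any individual input file.
COLUMNS = ["contentId", "file"]  # hypothetical pre-defined order

def stack(rows):
    """Normalize a list of dicts into lists in the fixed COLUMNS order."""
    return [[row.get(col) for col in COLUMNS] for row in rows]

stacked = stack([
    {"file": "a.mp3", "contentId": 1},     # keys in a different order
    {"contentId": 2, "extra": "ignored"},  # extra key, and 'file' missing
])
```

Columns not in the fixed list are dropped and missing ones become blanks, so the batch output can no longer drift.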

Thanks for the answer.

But the column names are the same? I'm probably missing something…

Also, can you please explain how to use Stack?

I have about 4,000 JSON files - actually these are radio station timetables - and I want to extract "Program name", "Start time", etc. to an Excel file. That would be about 20,000 … 40,000 lines.

I checked: the field names are the same no matter when the JSON was made (from 2011 to 2023).
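A batch of this shape is small by scripting standards; here is a hedged Python sketch of the whole run, assuming the field names "Program name" / "Start time" and a `programmes` list key (the key name and the sample document are hypothetical; a real run would `json.load()` each of the ~4,000 files):

```python
import csv
import io
import json

FIELDS = ["Program name", "Start time"]  # assumed field names from the timetables

def timetables_to_csv(json_texts, out):
    """Write one CSV row per programme entry found across all JSON documents."""
    writer = csv.writer(out)
    writer.writerow(FIELDS)
    count = 0
    for text in json_texts:  # in a real run: iterate over the ~4000 files
        doc = json.loads(text)
        for entry in doc.get("programmes", []):  # hypothetical list key
            writer.writerow([entry.get(f) for f in FIELDS])
            count += 1
    return count

buf = io.StringIO()
n = timetables_to_csv(
    ['{"programmes": [{"Program name": "News", "Start time": "06:00"}]}'],
    buf,
)
```

The resulting CSV opens directly in Excel, and 20,000–40,000 rows is well within its limits.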

OK, I missed the link you added. I understand that page, but can you please help me define a batch with 2 inputs - one stays the same but the other changes?

You need to set the file that is changing to a wildcard (e.g. c:\myfolder\*.json) in the batch window. See:

There is a link to a video on the above page.