Any thoughts? JSON data killed app, but same data in CSV does not

I ran an input JSON file through EDT, and after a while it was clear it was not going to complete; indeed, it jammed the whole machine (Ctrl/Alt/Del did not work).

I ran this twice more to confirm it, with this task being the only one running on the machine.
I then converted the file to CSV, and it completed in a few seconds. It was a flat file with 68 rows, so it's a quirk somewhere in the JSON logic.

It looked a bit like it was leaking memory somewhere, as the machine locked up with all 32 GB used.

Machine was an Asus mobo / i9-9900 / 32 GB / Samsung EVO 970 SSD.

Has anyone else experienced this? (PS: I'm a newbie and haven't scanned the board completely yet, so apologies if this is already known.)

It is possible there is a bug. But we have seen cases where a deeply nested JSON file expands into such a large table that it uses all the machine's memory. Unfortunately we are currently having problems trapping out-of-memory conditions on Windows (it works fine on Mac).
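For anyone curious how a 68-row file can eat 32 GB: here is a minimal sketch (not EDT's actual algorithm, and the field names are made up) of why naively cross-joining sibling arrays while flattening nested JSON grows multiplicatively:

```python
import itertools

# Hypothetical record with several sibling arrays.
record = {
    "id": 1,
    "tags": ["a", "b", "c"],         # 3 values
    "colors": ["red", "green"],      # 2 values
    "sizes": ["S", "M", "L", "XL"],  # 4 values
}

# A naive flattening that cross-joins every array produces
# 3 * 2 * 4 = 24 rows from this single record.
arrays = {k: v for k, v in record.items() if isinstance(v, list)}
rows = [dict(zip(arrays, combo)) for combo in itertools.product(*arrays.values())]
print(len(rows))  # 24
```

With a few more arrays of modest length per record, the row count multiplies out to billions, which would explain the machine locking up at full memory.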

Can you email support with a copy of the JSON file so we can check it here?

The other thing to try is to experiment with changing the Format from Wide (more columns) to Long (more rows) for the input. You can't make that change while the process is hung, but you can import a small JSON file, change the Format, and then change the input to the problem file.
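To make the Wide vs Long distinction concrete, here is a rough sketch of the two shapes for one record with an array (the column names are made up, not EDT's actual output):

```python
# Hypothetical record containing an array value.
data = {"id": 1, "scores": [10, 20, 30]}

# Wide (more columns): a single row, one column per array element.
wide = {"id": data["id"]}
for i, s in enumerate(data["scores"], start=1):
    wide[f"scores_{i}"] = s
# wide == {"id": 1, "scores_1": 10, "scores_2": 20, "scores_3": 30}

# Long (more rows): one row per array element.
long_rows = [{"id": data["id"], "score": s} for s in data["scores"]]
print(len(long_rows))  # 3
```

Which shape stays manageable depends on the data: Long trades columns for rows, so with heavily nested input one format can blow up where the other does not.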

Thanks for sending the data. I can confirm it is a bug that causes a combinatorial explosion when Easy Data Transform converts this particular dataset to Long format. I hope to have a fix soon.