Is it running out of memory when adding the input, or when adding transforms?
Easy Data Transform keeps everything in memory, for speed. It tries to be clever with the memory using reference counting. How much memory is needed depends on how much data is in each cell and how many unique values there are in each column. But 8m rows x 200 cols is pushing it. Especially if you start adding various transforms, which will use up more memory.
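Just to illustrate why unique values matter (this is only a rough sketch, not EDT's actual implementation): if repeated cell values are shared rather than copied, a column with only a few distinct values costs far less memory than one where every cell is different.

```python
# Illustrative sketch only (not EDT's real code): share repeated cell values
# through a dict so each unique string is stored once.
def intern_column(values):
    pool = {}
    return [pool.setdefault(v, v) for v in values]

# A column of ~1,000,000 cells but only 3 unique values.
raw = ["Yes", "No", "Maybe"] * 333_334
shared = intern_column(raw)

# The list still holds 1M references, but only 3 string objects are kept,
# so the per-cell cost is roughly one pointer instead of one full string.
unique_objects = {id(v) for v in shared}
print(len(shared), "cells,", len(unique_objects), "unique string objects")
```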
You could try setting the maximum memory significantly higher. But EDT will start to slow down if it has to use virtual memory (paging from RAM to disk).
Is it possible to split the file into several smaller sections first?
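If it helps, here is a rough sketch of one way to split a large CSV into smaller files before loading them separately (file names and the rows-per-file value are just placeholders):

```python
import csv

# Rough sketch: split a large CSV into chunks of ROWS_PER_FILE rows,
# repeating the header row in every chunk.
ROWS_PER_FILE = 1_000_000

with open("big_input.csv", newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)
    part, out, writer, count = 0, None, None, 0
    for row in reader:
        if out is None or count >= ROWS_PER_FILE:
            if out:
                out.close()
            part += 1
            out = open(f"part_{part:03d}.csv", "w", newline="", encoding="utf-8")
            writer = csv.writer(out)
            writer.writerow(header)
            count = 0
        writer.writerow(row)
        count += 1
    if out:
        out.close()
```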
If you can send me a copy of the file, I can investigate further. I recommend using swisstransfer.com or similar.
I loaded a 4GB (31 million rows x 15 cols) CSV file in EDT v1.27.0. It took 88% of the 10 GB of memory allowed. There are quite a lot of repeated data values in this CSV (which can be compressed in memory).