I often have to import "huge" files, and it is suggested to import such files in parallel. This can be done "easily" with tools like GNU parallel, but in most cases it is not so easy to get right (for example, GNU parallel's --pipe option should not be used because it is slow, and one should use pigz instead of gzip, since gzip does not use multiple cores).
So I would really like to see a new option that simply imports large files, with the program automatically using multiple connections and even decompressing files on the fly. This would also benefit all platforms that support Python but not GNU parallel.
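To make the idea concrete, here is a minimal sketch of what such an option might do internally: detect a .gz file, decompress it on the fly, split the stream into line batches, and hand each batch to a worker (which, in a real importer, would hold its own database connection). The function names (`parallel_import`, `import_batch`) and the batching scheme are my own assumptions for illustration, not part of any existing tool:

```python
import gzip
import itertools
from multiprocessing import Pool


def import_batch(lines):
    # Placeholder for the real per-connection import step.
    # Here we just count rows so the sketch runs stand-alone.
    return len(lines)


def batched(iterable, size):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch


def parallel_import(path, batch_size=10_000, workers=4):
    """Decompress .gz files on the fly and import line batches in parallel."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as f, Pool(workers) as pool:
        # In a real importer each worker would use its own connection;
        # here each batch is processed by the placeholder above.
        results = pool.map(import_batch, batched(f, batch_size))
    return sum(results)
```

The point is that all of this (compression detection, batching, worker count) could be handled by one built-in option instead of a hand-tuned GNU parallel pipeline.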