python - Parallel processing with numpy.loadtxt()


I have a >100 MB file that needs to be read with numpy.loadtxt().

The reading is the main bottleneck in my code: a 72 MB file takes 17.3 s.

Is it somehow possible to read the file in parallel using loadtxt()?

If possible, without splitting the file.

The problem seems to be numpy.loadtxt() itself.
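For context, a minimal sketch of the slow path being timed (the filename and column layout here are made up; a small generated file stands in for the real ~72 MB one):

```python
import time
import numpy as np

# Generate a small whitespace-delimited sample file standing in
# for the real, much larger data file
with open("data.txt", "w") as f:
    for i in range(1000):
        f.write(f"{i} {i * 0.5} {i * 2.0}\n")

# Time the call that dominates the runtime
start = time.time()
arr = np.loadtxt("data.txt")
elapsed = time.time() - start

print(arr.shape)
```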

http://wesmckinney.com/blog/?p=543

http://codrspace.com/durden/performance-lessons-for-reading-ascii-files-into-numpy-arrays/

According to these sites, you're better off not using numpy's load functions at all.

pandas.read_csv and pandas.read_table from the pandas module should be helpful.
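A minimal sketch of the pandas approach (the filename and column layout are illustrative, not from the original post): pandas' C parser is typically several times faster than numpy.loadtxt on large text files, and the result converts straight back to a NumPy array.

```python
import numpy as np
import pandas as pd

# Generate a small whitespace-delimited sample file
with open("data.txt", "w") as f:
    for i in range(1000):
        f.write(f"{i} {i * 0.5} {i * 2.0}\n")

# read_csv with a whitespace separator handles loadtxt-style files;
# header=None because the file has no header row
df = pd.read_csv("data.txt", sep=r"\s+", header=None)
arr = df.to_numpy()

# The result matches what np.loadtxt would return
same = np.array_equal(arr, np.loadtxt("data.txt"))
print(arr.shape, same)
```

For a file with a different delimiter, adjust `sep` accordingly; pandas.read_table is read_csv with a tab separator by default.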

