Parallel read_table in pandas

Time: 2013-11-12 23:16:42

Tags: python numpy pandas

Is there a way to call read_table() in parallel? In my case it is CPU-bound because of date parsing. I don't see any way to achieve this from reading the docs. The only thing that comes to mind is splitting the input file, calling read_table in parallel, and then concatenating the DataFrames.
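The split-read-concatenate idea can be sketched with the standard-library multiprocessing module (a minimal sketch, not from the original post; the `parse_dates=[0]` column position is an assumption):

```python
# Sketch: parallelize CPU-bound date parsing by reading each file in a
# separate worker process, then concatenating in the parent.
from multiprocessing import Pool

import pandas as pd

def read_part(path):
    # Each worker parses its own file, so date parsing runs in parallel.
    # parse_dates=[0] assumes the date column is the first one.
    return pd.read_csv(path, parse_dates=[0])

def parallel_read(paths):
    with Pool() as pool:
        parts = pool.map(read_part, paths)
    return pd.concat(parts, ignore_index=True)
```

Whether this is faster than a single read_table call depends on file sizes and the per-process startup cost.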

1 Answer:

Answer 0 (score: 1)

This will read the CSV files in parallel and concatenate them. The annoying part is that it won't handle numpy types, so it can't parse dates. I have been struggling with the same problem, and so far libraries like execnet seem unable to handle non-builtin types. That's why I convert the DataFrames to JSON before sending them: JSON strips everything down to basic Python types.
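The JSON round trip the answer relies on looks like this (a minimal sketch; the toy frame is mine, not the answer's). It shows both why it works for a channel, the payload is a plain str, and what it costs, since richer dtypes such as datetime64 are not preserved automatically:

```python
# to_json() serializes a DataFrame into a plain string built from builtin
# types only, which is what a channel like execnet's can transmit.
from io import StringIO

import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": ["x", "y"]})
payload = df.to_json()                   # plain str, safe to send
restored = pd.read_json(StringIO(payload))
```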

EDIT: if you need to parse dates, a more sensible approach might be to read the CSV files remotely, parse the dates there, and save the result as a pickle on disk. Then you can read the pickle files in the main process and concatenate them. I haven't tried it to see whether it brings a performance gain.
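That EDIT can be sketched as follows (my sketch, not the answer's code; the `"date"` column name is an assumption). Unlike JSON, pickle round-trips numpy dtypes, so datetime64 columns survive:

```python
# Worker side: parse dates while reading, then pickle the typed frame.
# Main-process side: unpickle (no parsing left to do) and concatenate.
import pandas as pd

def parse_and_pickle(csv_path, pkl_path):
    df = pd.read_csv(csv_path, parse_dates=["date"])  # assumed column name
    df.to_pickle(pkl_path)  # pickle preserves dtypes, incl. datetime64

def collect(pkl_paths):
    return pd.concat([pd.read_pickle(p) for p in pkl_paths])
```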

remote_read_csv.py

import cPickle as pickle

# execnet executes this module in the remote interpreter, where the module
# name is '__channelexec__' and `channel` is provided implicitly.
if __name__ == '__channelexec__':
    # The first message is the pickled reader function (e.g. pandas.read_csv).
    reader = pickle.loads(channel.receive())

    # Each subsequent message is a filename; reply with the frame as JSON.
    for filename in channel:
        channel.send(reader(filename).to_json())

The following uses the module above. I tested it in IPython.

from pandas import DataFrame, concat, read_csv, read_json
from numpy import random
import execnet
import remote_read_csv
import cPickle as pickle
import itertools
import psutil

### Create dummy data and save to CSV

def rdf():
    return DataFrame((random.rand(4, 3) * 100).astype(int))

d1 = rdf()
d2 = rdf()
d3 = rdf()

dfsl = [d1, d2, d3]
names = 'd1.csv d2.csv d3.csv'.split()
for df, name in zip(dfsl, names):
    df.to_csv(name)

### Read CSV files in separate processes, then concatenate

# The reader function itself can't cross the channel, so pickle it first.
reader = pickle.dumps(read_csv)

def set_gateways(remote_module, *channel_sends):
    # Spawn one gateway (remote interpreter) per CPU and send each channel
    # its initial payload(s) -- here, the pickled reader function.
    gateways = []
    channels = []
    for i in range(psutil.NUM_CPUS):
        gateways.append(execnet.makegateway())
        channels.append(gateways[i].remote_exec(remote_module))
        for send in channel_sends:
            channels[i].send(send)
    return (gateways, channels)

def para_read(names):
    gateways, channels = set_gateways(remote_read_csv, reader)
    mch = execnet.MultiChannel(channels)
    queue = mch.make_receive_queue()
    channel_ring = itertools.cycle(mch)  # round-robin the files over the workers
    for f in names:
        channel = channel_ring.next()
        channel.send(f)
    dfs = []
    for i in range(len(names)):
        # Note: results arrive in completion order, which may not match the
        # order of `names`, so the keys below can end up mislabeled.
        channel, df = queue.get()
        dfs.append(df)

    for gw in gateways:
        gw.exit()
    return concat([read_json(i) for i in dfs], keys=names)

para_read(names)