Inserting data from a CSV into PostgreSQL

Time: 2019-10-17 11:00:17

Tags: python

I am still learning Python and am trying to upload a large CSV to PostgreSQL. The code works and loads the CSV in about 15 minutes. The problem is that I stopped/crashed the code partway through (while the inserts were running), and afterwards I could not run anything against the table, e.g. SELECT or TRUNCATE, because every statement hangs (even when run manually). PostgreSQL is installed on AWS, and I can't tell whether this is a problem with my code or with AWS. Has anyone run into this?
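[Editorial note: the symptom described here, every later statement hanging, is consistent with the crashed session leaving an open transaction that still holds a lock on the table. A minimal diagnostic sketch, assuming you can open a NEW connection (e.g. with psycopg2) while the old statements hang; `pg_blocking_pids()` requires PostgreSQL 9.6+:]

```python
# Lists sessions that are blocked by another session, plus sessions
# sitting "idle in transaction" (a common leftover after a crashed client).
BLOCKERS_SQL = """
SELECT pid, state, query, pg_blocking_pids(pid) AS blocked_by
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0
   OR state = 'idle in transaction';
"""

def find_blockers(conn):
    """conn: any DB-API connection (e.g. psycopg2.connect(...))."""
    with conn.cursor() as cur:
        cur.execute(BLOCKERS_SQL)
        return cur.fetchall()

# A stuck session found this way can then be killed with:
#   SELECT pg_terminate_backend(<pid>);
```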

import pandas as pd  # missing import added; DBCon is the asker's own config helper

def load_main_table(sql_table_name, start_date):
    csv_file = r'Y:\test.csv'  # PartnerMONTHLYPRICING - 2019-09-01_1.csv'
    connection = DBCon.get_connection_by_config(r'Y:\database.ini', 'postgresql_conn_data')
    cursor = connection.cursor()

    cursor.execute("""truncate table "db".table;""")
    print("Table Truncated")

    global chunk  # kept global to QC chunks that fail for some reason
    total = 0
    for chunk in pd.read_csv(csv_file, sep='|', chunksize=20000, low_memory=False):
        chunk.replace('"', '', inplace=True, regex=True)
        # escape single quotes in the data by doubling them
        chunk.replace('\'', '\'\'', inplace=True, regex=True)
        chunk.insert(49, "start_date", start_date)
        chunk.fillna('NULL', inplace=True)
        my_data = str(chunk.to_records(index=False).tolist())  # convert records to one string
        my_data = my_data[1:-1]  # strip the outer brackets
        my_data = my_data.replace('\"', '\'').replace('\'NULL\'', 'NULL')  # unquote NULLs

        sql = """
            INSERT INTO db.table
            VALUES {0}
            """.format(my_data)
        cursor.execute(sql)
        # commit() is required to persist the data unless autocommit is True
        connection.commit()
        total += len(chunk)  # was hard-coded +10000, which disagreed with chunksize=20000
        print("Imported " + str(total) + " rows in " + sql_table_name)
    cursor.close()
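[Editorial note: for a file this size, building giant INSERT strings per chunk is slow and fragile (the manual quote escaping above is a common source of corruption). PostgreSQL's COPY is the usual alternative; a sketch using psycopg2's `cursor.copy_expert`, where the table name and the `NULL ''` mapping are assumptions based on the question, not the asker's confirmed schema:]

```python
import io

# COPY streams rows in bulk and handles CSV quoting itself,
# so no manual single-quote doubling is needed.
COPY_SQL = "COPY db.table FROM STDIN WITH (FORMAT csv, DELIMITER '|', NULL '')"

def copy_chunk(connection, chunk):
    """Stream one pandas chunk into PostgreSQL via psycopg2's copy_expert."""
    buf = io.StringIO()
    chunk.to_csv(buf, sep='|', header=False, index=False)
    buf.seek(0)
    with connection.cursor() as cur:
        cur.copy_expert(COPY_SQL, buf)
    connection.commit()
```

Committing per chunk (as the original code does) keeps transactions short, which also limits how long locks are held if the client crashes mid-load.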

0 Answers