Preventing script timeouts when querying large datasets

Posted: 2012-04-10 17:12:52

Tags: php mysql timeout

I have the following code:

$query = mysql_query("SELECT * FROM mytable");

while($row = mysql_fetch_assoc($query)){
    mysql_query("INSERT INTO mytable2 (col1, col2) 
                 VALUES ('".$row['val1']."', '".$row['val2']."')");
}

Understandably, the script times out at around 150,000 queries... Aside from raising the script's memory limit, what is the best way to prevent it from timing out?

2 Answers:

Answer 0 (score: 6):

Why not run it as a single query?

$SQL = "INSERT INTO mytable2 (col1,col2) SELECT val1,val2 FROM mytable";
$query = mysql_query($SQL); 

ALTERNATIVE

You can also limit the INSERTs to 200 rows at a time:

$query = mysql_query("SELECT * FROM mytable"); 
$commit_count = 0;    
$commit_limit = 200;
$comma = "";
$SQL = "INSERT INTO mytable2 (col1, col2) VALUES ";
while($row = mysql_fetch_assoc($query)){ 
    $SQL .= $comma . "('".$row['val1']."','".$row['val2']."')";
    $comma = ",";
    $commit_count++;
    if ( $commit_count == $commit_limit )
    {
        mysql_query($SQL);
        $SQL = "INSERT INTO mytable2 (col1, col2) VALUES ";
        $commit_count = 0;    
        $comma = "";
    }
} 
if ( $commit_count > 0 ) { mysql_query($SQL); }

You can change $commit_limit to any reasonable positive number.

Answer 1 (score: 3):

Rather than running that many individual inserts, you should consider using an INSERT ... SELECT statement.
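
A minimal sketch of that approach, reusing the table and column names from the question (essentially the same statement answer 0 shows; the error check is added here only for illustration):

$SQL = "INSERT INTO mytable2 (col1, col2) SELECT val1, val2 FROM mytable";
// The copy happens entirely inside MySQL, so PHP never loops over 150,000 rows.
if (!mysql_query($SQL)) {
    die('Insert failed: ' . mysql_error());
}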