How to improve the performance of PL/SQL UTL_FILE

Date: 2014-10-10 07:20:37

Tags: sql oracle plsql

I need to export up to 200 million BLOBs from an Oracle database to a Linux directory as image files. The Oracle WebCenter batch loader will then ingest these images into Oracle WebCenter Content. The code below attempts to achieve the main goals:

  1. Select the BLOB columns from the given table, then extract those BLOBs to flat files.
  2. Log every exported image in the image_log3 table. The contents of image_log3 are used to generate the batch loader input file.
  3. Log all failed images in the image_null_log3 table, and record the total time taken to export the images, from start to finish, in the time_log table.
  4. Problem:

    The code takes too long to run. Currently it takes 10.9 to export 100,000 BLOBs and write them to files, and 0.5 to export 10,000.

    I am already using BULK COLLECT and optimizer hints. How can I improve this code for faster performance, so that I can export and write as many images as possible to flat files?

    Please advise.
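The steps above boil down to reading each BLOB in fixed-size chunks with DBMS_LOB.read and writing the chunks out with UTL_FILE.put_raw. For reference, the core loop can be reduced to the sketch below (illustrative only: logging is omitted and the directory object name matches the code that follows). Note that UTL_FILE buffers writes internally and FCLOSE flushes the buffer, so a per-chunk FFLUSH is not strictly needed:

```sql
-- Sketch: write one BLOB to a file in 32 KB chunks (names are illustrative).
PROCEDURE write_blob (p_blob BLOB, p_file_name VARCHAR2)
AS
   l_file   UTL_FILE.FILE_TYPE;
   l_buffer RAW (32767);
   l_amount PLS_INTEGER := 32767;
   l_pos    PLS_INTEGER := 1;
   l_len    PLS_INTEGER := NVL (DBMS_LOB.getlength (p_blob), 0);
BEGIN
   l_file := UTL_FILE.fopen ('ECM_MIGRATION_QA', p_file_name, 'wb', 32767);

   WHILE l_pos <= l_len
   LOOP
      -- read the next chunk, shrinking the amount for the final partial chunk
      l_amount := LEAST (32767, l_len - l_pos + 1);
      DBMS_LOB.read (p_blob, l_amount, l_pos, l_buffer);
      UTL_FILE.put_raw (l_file, l_buffer);   -- no per-chunk fflush; fclose flushes
      l_pos := l_pos + l_amount;
   END LOOP;

   UTL_FILE.fclose (l_file);
END write_blob;
```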

    SET SERVEROUTPUT ON
    
    CREATE OR REPLACE PACKAGE BODY OWCMIGRATION.biometric_blob_to_file_april28
     AS
    
        /*
        -------------------------------------------------------------------------------------------------------------------------------------
        Procedure Name: write_bio_blob_to_file_mar26
        Procedure Description: This procedure is used to convert blob data into image file.
        -------------------------------------------------------------------------------------------------------------------------------------
        */
    
        PROCEDURE write_bio_blob_to_file_mar26( ls_pin_ref VARCHAR2, ls_msisdn VARCHAR2,l_blob BLOB, ls_file_name VARCHAR2)
        AS
    
           lclob_file              CLOB;
           ln_cursor_cnt        NUMBER := 0;
           t_blob                  BLOB;
           l_file               UTL_FILE.FILE_TYPE;
           t_TotalSize          NUMBER;
           t_file_name         VARCHAR2 (100);
           t_remain             NUMBER;
           t_position            NUMBER := 1;
           l_buffer               RAW(32760);--VARCHAR2(32760);--
           l_amount             BINARY_INTEGER := 32760;
           err_code              VARCHAR2(10);
           err_msg               VARCHAR2(250);
    
    
        BEGIN
    
            DBMS_LOB.createtemporary (lclob_file, TRUE);
    
        --Incrementing the cursor count and the commit count for each record being processed.
              ln_cursor_cnt := ln_cursor_cnt + 1;
    
        --Calculating the length and getting the file name of the image
    
             SELECT DBMS_LOB.getlength  (l_blob)    BLOB_BASE64,    ls_file_name
                INTO t_TotalSize, t_file_name
                FROM DUAL
                WHERE nvl(length(dbms_lob.getlength(l_blob)),0)<>0; 
    
    
        --If the length of the blob is not null, insert into the image log table for generating the config file
                IF t_TotalSize <>0 THEN
    
                    INSERT /*+ APPEND PARALLEL */ INTO Image_Log3
                        VALUES(ls_pin_ref,ls_msisdn,ls_file_name);
    
                    COMMIT;
    
                END IF;
    
            --Opening File in Write mode in directory migration2
            l_file :=      UTL_FILE.fopen ('ECM_MIGRATION_QA',  t_file_name, 'wb',  32767); --BIO_MIGRATION_QA
              t_remain := t_TotalSize;
    
            --Converting Blob into Images
    
    --         IF t_TotalSize < l_amount THEN
    --         dbms_output.put_line('0 blob:'||ls_file_name);
    --         
    --          dbms_output.put_line('length:'||t_TotalSize);
    --                UTL_FILE.put_raw (l_file, l_buffer);
    --                UTL_FILE.fflush (l_file);
    
    --         ELSE
                 WHILE t_position < t_TotalSize
                  LOOP
                     BEGIN
                     dbms_output.put_line(' blob:'||ls_file_name);
                        DBMS_LOB.read (l_blob, l_amount, t_position,  l_buffer);
                        UTL_FILE.put_raw (l_file, l_buffer);
                        UTL_FILE.fflush (l_file);
                        t_position := t_position + l_amount;
                        t_remain := t_remain - l_amount;
    
                        IF t_remain < 32767
                        THEN
                           l_amount := t_remain;
                        END IF;
    
                     EXCEPTION
    
                        WHEN NO_DATA_FOUND
                        THEN
                           NULL;
                     END;
    
                  END LOOP;
    
    --         END IF;
    
           UTL_FILE.FCLOSE(l_file);
           DBMS_LOB.FREETEMPORARY (lclob_file);
    
        EXCEPTION
    
           --Exception to handle Invalid Path
           WHEN UTL_FILE.Invalid_path
               THEN
                  DBMS_OUTPUT.put_line ('Invalid Dir');
                  DBMS_OUTPUT.put_line ('Directory Error: '||ls_file_name );            
    
           --Exception to handle Invalid Operation
           WHEN UTL_FILE.Invalid_operation
               THEN
                  DBMS_OUTPUT.put_line ('Invalid operation');
                  DBMS_OUTPUT.put_line ('File Error write_bio_blob_to_file: '||ls_file_name ); 
    
            WHEN OTHERS
                THEN
                    err_code := SQLCODE;
                    err_msg := SUBSTR(SQLERRM, 1, 200);
                  DBMS_OUTPUT.put_line ('Others ');
                  DBMS_OUTPUT.put_line ('File Error others: '||ls_file_name ); 
    
                  INSERT /*+ APPEND PARALLEL */ INTO image_null_log2
                    VALUES(ls_pin_ref,ls_msisdn,ls_file_name,l_blob,err_code,err_msg);
    
              COMMIT;
    
        END write_bio_blob_to_file_mar26;
    
        /*
        -------------------------------------------------------------------------------------------------------------------------------------
        Procedure Name: exp_bio_blob_to_file_mar26
        Procedure Description : This Procedure is used to export images.
        -------------------------------------------------------------------------------------------------------------------------------------
        */
        PROCEDURE exp_bio_blob_to_file_mar26
        AS
    
    
            TYPE BioCurTyp IS REF CURSOR;
            bio_cv BioCurTyp; 
    
            -- Table type that can hold information about owc_isl_data_test
            TYPE BioList_tab IS TABLE OF CUSTOMER_BIOMETRIC%ROWTYPE;
            BiometricInfo BioList_tab;
    
            start_time    number;
            end_time    number;
            l_diff        number;
    
        BEGIN
    
           --Truncating Image Log1 table
           EXECUTE IMMEDIATE  'truncate table Image_Log3';
           EXECUTE IMMEDIATE  'truncate table Image_null_log3';
    
          start_time := DBMS_UTILITY.get_time;
    
          --create a cursor that select all biometric data
          OPEN bio_cv FOR
                        SELECT /*+ FIRST_ROWS*/ *
                           FROM CUSTOMER_BIOMETRIC
                                WHERE rownum<1000000;
    
    
          --fetching biometric data from cursor in bulk of 1000
          FETCH bio_cv BULK COLLECT INTO BiometricInfo LIMIT 700000;
            CLOSE bio_cv;
    
            /* Export images from owc_isl_data_test table then write correspoding image to directory 
            to MIGRATION_2 which I mapped to ojecmtst01 path /ecm/digital/migration */
    
            FOR i IN BiometricInfo.FIRST .. BiometricInfo.LAST LOOP
    
              write_bio_blob_to_file_mar26 (BiometricInfo(i).pinref,BiometricInfo(i).MSISDN,BiometricInfo(i).PHOTO_PERSON,BiometricInfo(i).MSISDN ||'_PT_05.jpg');
              write_bio_blob_to_file_mar26 (BiometricInfo(i).pinref,BiometricInfo(i).MSISDN,BiometricInfo(i).FINGPRNT_LEFT_THUMB,BiometricInfo(i).MSISDN ||'_LT_05.wsq');
              write_bio_blob_to_file_mar26 (BiometricInfo(i).pinref,BiometricInfo(i).MSISDN,BiometricInfo(i).FINGPRNT_LEFT_INDEX,BiometricInfo(i).MSISDN ||'_LF_05.wsq');
              write_bio_blob_to_file_mar26 (BiometricInfo(i).pinref,BiometricInfo(i).MSISDN,BiometricInfo(i).FINGPRNT_RIGHT_THUMB,BiometricInfo(i).MSISDN ||'_RT_05.wsq');
              write_bio_blob_to_file_mar26 (BiometricInfo(i).pinref,BiometricInfo(i).MSISDN,BiometricInfo(i).FINGPRNT_RIGHT_INDEX,BiometricInfo(i).MSISDN ||'_RF_05.wsq');
    
              --write_bio_blob_to_file_mar26 (BiometricInfo(i).pin_ref,BiometricInfo(i).MSISDN,BiometricInfo(i).PASSPORT_PHOTO,BiometricInfo(i).msisdn||'_PP_05.jpg');
              --write_bio_blob_to_file_mar26 (BiometricInfo(i).pin_ref,BiometricInfo(i).MSISDN,BiometricInfo(i).CER_IMG_DATA,BiometricInfo(i).msisdn||'_CI_05.jpg');
    
            END LOOP;
    
    
            COMMIT;
    
            end_time := DBMS_UTILITY.get_time;
            l_diff := (end_time-start_time)/100;
    
            INSERT INTO time_log 
                VALUES(l_diff);
    
            DBMS_OUTPUT.PUT_LINE('start time ' || to_char(start_time));
            DBMS_OUTPUT.PUT_LINE('end time ' || to_char(end_time));
            DBMS_OUTPUT.PUT_LINE('Elapsed  time ' || l_diff ||'sec'); 
            DBMS_OUTPUT.put_line('end loop');
    
        EXCEPTION
    
        --Exception to handle Invalid Path
            WHEN UTL_FILE.Invalid_path
                THEN
                    DBMS_OUTPUT.put_line('Invalid Dir');
    
            --Exception to handle Invalid Operation
            WHEN UTL_FILE.Invalid_operation
                THEN
                    DBMS_OUTPUT.put_line('Invalid operation');     
             WHEN OTHERS
                 THEN
                    DBMS_OUTPUT.put_line ('Others '||SQLCODE||SUBSTR(SQLERRM, 1, 200)); 
    
        END exp_bio_blob_to_file_mar26;                                                            
    
    
    End biometric_blob_to_file_april28;
    /
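
As a side note on the write procedure above: the length/file-name lookup does not need a SELECT ... FROM DUAL per row. A direct assignment avoids the SQL-engine round trip and the NO_DATA_FOUND that the WHERE clause raises for NULL BLOBs (a sketch using the same variable names; illustrative only):

```sql
-- Sketch: plain assignments instead of SELECT ... FROM DUAL.
t_TotalSize := NVL (DBMS_LOB.getlength (l_blob), 0);
t_file_name := ls_file_name;

IF t_TotalSize = 0 THEN
   RETURN;   -- nothing to write for NULL or empty BLOBs
END IF;
```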
    

    In response to Ben's question

    I am not sure what the best way to parallelize this is. I tried parallelism by splitting the records into ranges of 100,000, but could not get it to work. I did the following:

    1. In the same package I created a procedure named exp_bio_blob_to_file_1_mar26. This procedure is the same as exp_bio_blob_to_file_mar26 except for the following SQL, which selects the records between 100,001 and 200,000:

          OPEN bio_cv FOR
               SELECT MSISDN,
                      PINREF,
                      PHOTO_PERSON,
                      FINGPRNT_LEFT_THUMB,
                      FINGPRNT_LEFT_INDEX,
                      FINGPRNT_RIGHT_THUMB,
                      FINGPRNT_RIGHT_INDEX
                 FROM (SELECT a.*, ROWNUM rnum
                         FROM (SELECT /*+ FIRST_ROWS */ * FROM CUSTOMER_BIOMETRIC) a
                        -- end point of the record range
                        WHERE ROWNUM <= 200000)
                -- start point of the record range
                WHERE rnum >= 100001;

    2. SET SERVEROUTPUT ON

          DECLARE
             jobnumber33 NUMBER;
             jobnumber44 NUMBER;
             start_time  NUMBER;
             end_time    NUMBER;
             l_diff      NUMBER;
          BEGIN
             start_time := DBMS_UTILITY.get_time;

             dbms_job.submit(jobnumber33, 'begin biometric_blob_to_file_mar28.exp_bio_blob_to_file_mar26; dbms_output.put_line(''test1''); end;');
             COMMIT;

             dbms_job.submit(jobnumber44, 'begin biometric_blob_to_file_mar28.exp_bio_blob_to_file_1_mar26; dbms_output.put_line(''test2''); end;');
             COMMIT;

             dbms_job.run(jobnumber33);
             DBMS_OUTPUT.PUT_LINE('submitted job ' || jobnumber33);

             dbms_job.run(jobnumber44);
             DBMS_OUTPUT.PUT_LINE('submitted job ' || jobnumber44);

             end_time := DBMS_UTILITY.get_time;
             l_diff   := (end_time - start_time) / 100;
             DBMS_OUTPUT.PUT_LINE('Elapsed time ' || l_diff || ' sec');
          END;
          /

    3. The first job executes fine, but the second job goes straight into the OTHERS exception section without executing anything. I do not know why. I am also trying to create 2 different tables, each holding a separate range of records to export.
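
    For what it is worth, DBMS_JOB.run executes a job synchronously in the calling session, so the two RUN calls in step 2 execute one after the other rather than concurrently. One way to start the two range procedures in separate background sessions is DBMS_SCHEDULER (a sketch, assuming both procedures compile; the job names are illustrative):

    ```sql
    -- Sketch: run the two export ranges in separate background sessions.
    BEGIN
       DBMS_SCHEDULER.create_job (
          job_name   => 'EXP_BLOBS_RANGE_1',
          job_type   => 'PLSQL_BLOCK',
          job_action => 'begin biometric_blob_to_file_mar28.exp_bio_blob_to_file_mar26; end;',
          enabled    => TRUE);

       DBMS_SCHEDULER.create_job (
          job_name   => 'EXP_BLOBS_RANGE_2',
          job_type   => 'PLSQL_BLOCK',
          job_action => 'begin biometric_blob_to_file_mar28.exp_bio_blob_to_file_1_mar26; end;',
          enabled    => TRUE);
    END;
    /
    ```

    Job status can then be checked in USER_SCHEDULER_JOB_RUN_DETAILS.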

      In response to Jeffrey's question

      1. Disk IO could be the bottleneck. I will ask the Linux team for disk IO statistics, but once disk IO is ruled out as the bottleneck I can look into parallelism.
      2. I have just obtained the disk IO statistics below. It looks like disk IO is not the bottleneck.

        root@owcm880 # zpool iostat owcmigration 3
                         capacity     operations    bandwidth
        pool          alloc   free   read  write   read  write
        ------------  -----  -----  -----  -----  -----  -----
        owcmigration  6.20G   289G      0      5    393   140K
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0     65      0   881K
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        owcmigration  6.20G   289G      0      0      0      0
        

0 answers:

No answers yet.