Counting word frequency in multiple files/documents in Java

Date: 2012-11-21 12:24:41

Tags: java words word-frequency

I want to count word frequencies across multiple files/documents in Java.

e.g.

a1 = {aaa, aaa, aaa, bbb}
a2 = {aaa, aaa, hhh}
a3 = {aaa, hhh, bbb, bbb}

So, I want to count the word frequency for each file:

for a1 file {aaa = 3, bbb = 1}
for a2 file {aaa = 2, hhh = 1}
for a3 file {aaa = 1, hhh = 1, bbb = 2}

I have a method that reads the words from a file and then stores <wordname, wordcount> in a LinkedHashMap. However, it counts the frequency of a given word across all files, whereas I want to count the word frequency of each file separately.
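A minimal sketch of a method along those lines (the countWords name, whitespace tokenization, and UTF-8 reading are assumptions, not the actual code), returning a fresh LinkedHashMap per file:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.LinkedHashMap;

public class WordCount {
    // Hypothetical helper: counts word frequencies for a single file
    public static LinkedHashMap<String, Integer> countWords(String path) throws IOException {
        LinkedHashMap<String, Integer> wordCount = new LinkedHashMap<String, Integer>();
        for (String line : Files.readAllLines(Paths.get(path), StandardCharsets.UTF_8)) {
            for (String word : line.trim().split("\\s+")) {
                if (word.isEmpty()) {
                    continue;                 // skip blanks produced by empty lines
                }
                Integer current = wordCount.get(word);
                wordCount.put(word, current == null ? 1 : current + 1);
            }
        }
        return wordCount;                     // a separate map per file
    }
}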

Does anyone have a solution?


Then, I wrote this:

Set<String> mapset = fileToWordCount.keySet();

for (String filenameFromMap : mapset) {
    System.out.println("FILENAME::" + filenameFromMap);
}

However, it doesn't print anything.

2 Answers:

Answer 0 (score: 4)

You can create another Map that maps a file name to a LinkedHashMap holding that file's word counts. So you would have something like this:

Map<String, LinkedHashMap<String, Integer>> fileToWordCount = new HashMap<String, LinkedHashMap<String, Integer>>();

Then, for each file, you build the word frequencies as usual and add the values to the map this way:

fileToWordCount.put(file.getPath(), wordCountMap);
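Putting the two pieces together, a minimal sketch of this approach (the folder path, whitespace tokenization, and UTF-8 reading are assumptions for illustration) might look like this:

import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class PerFileWordCount {
    public static void main(String[] args) throws IOException {
        // outer map: file path -> that file's own word-count map
        Map<String, LinkedHashMap<String, Integer>> fileToWordCount =
                new HashMap<String, LinkedHashMap<String, Integer>>();

        File folder = new File("path/to/folder");   // assumed input directory
        for (File file : folder.listFiles()) {
            LinkedHashMap<String, Integer> wordCountMap = new LinkedHashMap<String, Integer>();
            for (String line : Files.readAllLines(file.toPath(), StandardCharsets.UTF_8)) {
                for (String word : line.trim().split("\\s+")) {
                    if (word.isEmpty()) {
                        continue;
                    }
                    Integer current = wordCountMap.get(word);
                    wordCountMap.put(word, current == null ? 1 : current + 1);
                }
            }
            fileToWordCount.put(file.getPath(), wordCountMap);
        }

        // each file now has its own independent frequency map
        for (Map.Entry<String, LinkedHashMap<String, Integer>> entry : fileToWordCount.entrySet()) {
            System.out.println(entry.getKey() + " -> " + entry.getValue());
        }
    }
}

With the map populated this way, iterating over fileToWordCount.keySet() as in the loop from the question will print one file name per input file.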

Answer 1 (score: 0)

import java.io.*;
import java.util.*;

public class file1 {
    public static void main(String[] args) throws Exception {
        HashMap<String, Integer> words_fre = new HashMap<String, Integer>();
        try {
            File folder = new File("</file path>");   // folder containing the input files
            File[] listOfFiles = folder.listFiles();

            BufferedReader bufferedReader = null;
            BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                    new FileOutputStream("outfilename.txt", false), "UTF-8"));

            for (File file : listOfFiles) {
                bufferedReader = new BufferedReader(
                        new InputStreamReader(new FileInputStream(file), "UTF-8"));

                String s;
                while ((s = bufferedReader.readLine()) != null) {
                    // strip markup tags and replace punctuation with spaces
                    s = s.replaceAll("<.*?>", " ");
                    s = s.replace("॥", " ").replace(":", " ").replace("।", " ")
                         .replace(",", " ").replace("!", " ").replace("?", " ");

                    // count every token on this line
                    StringTokenizer st = new StringTokenizer(s, " ");
                    while (st.hasMoreTokens()) {
                        String str = st.nextToken();
                        if (words_fre.containsKey(str)) {
                            words_fre.put(str, words_fre.get(str) + 1);
                        } else {
                            words_fre.put(str, 1);
                        }
                    }
                }
                bufferedReader.close();

                // write this file's counts, sorted by word
                out.write(file.getName() + ":\n");
                Object[] key = words_fre.keySet().toArray();
                Arrays.sort(key);
                for (int i = 0; i < key.length; i++) {
                    out.write(key[i] + " : " + words_fre.get(key[i]) + "\n");
                }

                // reset the map so the next file is counted separately
                words_fre.clear();
            }

            out.close();
        } catch (FileNotFoundException ex) {
            System.out.println("Error in reading file");
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}