How to read a large .txt file in chunks of 1000 lines

Problem description

I want to repeatedly read and process 1000-line chunks from a file until the end of the file is reached.

Path pp = FileSystems.getDefault().getPath("logs","access.log");
final int BUFFER_SIZE = 1024*1024; // note: this is a size in bytes, not lines

FileInputStream fis = new FileInputStream(pp.toFile());
byte[] buffer = new byte[BUFFER_SIZE]; 
int read = 0;
while( ( read = fis.read( buffer ) ) > 0 ){
    // call your other methods here...
}

fis.close();

Solution

I have faced the same situation for years. My eventual solution was the subList() method of the List interface, which you can use as follows:

Step 1: read all lines from the given file

 String textfileRow = null;
 List<String> fileLines = new ArrayList<String>();
 try (BufferedReader fileContentBuffer = new BufferedReader(new FileReader(<your file>)))
 {
     while ((textfileRow = fileContentBuffer.readLine()) != null)
     {
         fileLines.add(textfileRow);
     }
 }

Step 2: create chunks of the given size from the previously built list

    final int CHUNKSIZE = <your needed chunk size>;
    int lineIndex = 0;
    while (lineIndex < fileLines.size())
    {
        int chunkEnd = lineIndex + CHUNKSIZE;
    
        if (chunkEnd >= fileLines.size())
        {
            chunkEnd = fileLines.size();
        }
        List<String> mySubList = fileLines.subList(lineIndex, chunkEnd);
                
        // Whatever you want to do...
                
        lineIndex = chunkEnd;
    }
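The two steps above can be combined into one self-contained sketch. This is an untested adaptation of the answer's code; the class name ChunkedFileReader and the method names readAllLines and chunkLines are mine, not from the original:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedFileReader {

    // Step 1: read every line of the file into memory.
    static List<String> readAllLines(String fileName) throws IOException {
        List<String> fileLines = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(new FileReader(fileName))) {
            String row;
            while ((row = in.readLine()) != null) {
                fileLines.add(row);
            }
        }
        return fileLines;
    }

    // Step 2: split the list into chunks of at most chunkSize lines.
    // Note that subList returns views backed by the original list,
    // so no line data is copied.
    static List<List<String>> chunkLines(List<String> fileLines, int chunkSize) {
        List<List<String>> chunks = new ArrayList<>();
        int lineIndex = 0;
        while (lineIndex < fileLines.size()) {
            int chunkEnd = Math.min(lineIndex + chunkSize, fileLines.size());
            chunks.add(fileLines.subList(lineIndex, chunkEnd));
            lineIndex = chunkEnd;
        }
        return chunks;
    }
}
```

Since the last chunk is simply cut at fileLines.size(), it may contain fewer than chunkSize lines.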

In my project I use this with CSV files of up to 20,000 lines, and it works well.

Edit: I saw the request for a text file in the title, so I changed the way the file is read.


Old approach: use BufferedReader's readLine method instead of a raw FileInputStream

 Path path = // access your path...;
 List<String> buffer = new ArrayList<>();
 try (BufferedReader in = new BufferedReader(new FileReader(path.toFile()))) {
    String nextLine = null;
    do  {
        buffer.clear();
        for (int i=0; i < chunkSize; i++) {
            // note that in.readLine() returns null at end-of-file
            if ((nextLine = in.readLine()) == null) break;
            buffer.add(nextLine);
        }
        processChunk(buffer); // note size may be less than chunkSize
    } while (nextLine != null);
 } catch (IOException ioe) {
    // handle exceptions here
 }
 // all resources will be automatically closed before this line is reached

Newer approach: use Files.lines to access a lazy stream of lines:

 Path path = // access your path...;
 final AtomicInteger c = new AtomicInteger();
 try (Stream<String> lines = Files.lines(path)) {
     lines.collect(Collectors.groupingBy(e -> c.getAndIncrement() / chunkSize,
                                         TreeMap::new, Collectors.toList()))
          .values()
          .forEach(chunk -> processChunk(chunk));
 }
 // the underlying file handle is closed when the try block exits
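One caveat with the groupingBy version: it materializes every line into a map before any chunk is handed to processChunk, so it does not actually stream the file. A lazy alternative (my sketch, not part of the original answer; the class LazyChunker and method forEachChunk are hypothetical names) iterates the stream and flushes a buffer every chunkSize lines:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;
import java.util.stream.Stream;

public class LazyChunker {

    // Streams the file and hands each chunk to the consumer without
    // keeping more than chunkSize lines in memory at once.
    static void forEachChunk(Path path, int chunkSize,
                             Consumer<List<String>> processChunk) throws IOException {
        try (Stream<String> lines = Files.lines(path)) {
            List<String> buffer = new ArrayList<>(chunkSize);
            Iterator<String> it = lines.iterator();
            while (it.hasNext()) {
                buffer.add(it.next());
                if (buffer.size() == chunkSize) {
                    processChunk.accept(buffer);
                    buffer = new ArrayList<>(chunkSize);
                }
            }
            if (!buffer.isEmpty()) {
                processChunk.accept(buffer); // final, possibly smaller chunk
            }
        }
    }
}
```

This keeps memory use bounded by the chunk size rather than the file size, which matters for log files too large to fit in memory.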

Disclaimer: I have not tested either of these, but both approaches should work.