in Python by (1.6k points)

I have a large text file (~7 GB) and I am looking for the fastest way to read it. I have been reading about several approaches, such as reading it chunk by chunk, in order to speed up the process.

For example, effbot suggests:

# File: readline-example-3.py
# Read the file in batches of lines totalling roughly 100,000 bytes per call.
with open("sample.txt") as file:
    while True:
        lines = file.readlines(100000)  # sizehint: approximate bytes per batch, not a line count
        if not lines:
            break
        for line in lines:
            pass  # do something with each line

This reportedly processes about 96,900 lines of text per second. Other authors suggest using islice():

from itertools import islice

with open(...) as f:
    while True:
        next_n_lines = list(islice(f, n))
        if not next_n_lines:
            break
        # process next_n_lines

list(islice(f, n)) returns a list of the next n lines of the file f. Using this inside a loop gives you the file in chunks of n lines.
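
If it helps, the islice pattern above can be wrapped in a small generator so the chunking logic is reusable. This is only a sketch: the helper name read_in_chunks, the file name sample.txt, and the chunk size of 10,000 lines are illustrative assumptions, not part of the quoted examples.

from itertools import islice

def read_in_chunks(path, n):
    # Hypothetical helper: yield successive lists of up to n lines from the file at path.
    with open(path) as f:
        while True:
            chunk = list(islice(f, n))
            if not chunk:
                break
            yield chunk

for chunk in read_in_chunks("sample.txt", 10000):  # illustrative file name and chunk size
    for line in chunk:
        pass  # do something with each line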

1 Answer

by (25.1k points)

Just open the file in a with block so you don't have to close it explicitly. Then iterate over the file object line by line in a for loop and process each line. For example:

with open("file.txt") as f:

    for line in f:

        process_lines(line)
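
As a complete, runnable illustration of the same pattern, here is a minimal sketch; the line-counting logic and the file name are assumptions made for the example, not part of the answer itself. Because the file object is iterated lazily, memory use stays roughly constant even for a ~7 GB file.

def count_lines(path):
    # Illustrative example: stream the file line by line and count the lines.
    # The file object yields one line at a time, so only one line is held in memory.
    count = 0
    with open(path) as f:
        for line in f:
            count += 1
    return count

print(count_lines("file.txt"))  # "file.txt" is just a placeholder path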

