
I would like to read several CSV files from a directory into pandas and concatenate them into one big DataFrame, but I have not been able to figure it out. Here is what I have so far:

import glob
import pandas as pd

# get data file names
path = r'C:\DRO\DCL_rawdata_files'
filenames = glob.glob(path + "/*.csv")

dfs = []
for filename in filenames:
    dfs.append(pd.read_csv(filename))

# Concatenate all data into one DataFrame
big_frame = pd.concat(dfs, ignore_index=True)

I guess I need some help within the for loop?

1 Answer


You can use the code below, where header=0 tells read_csv to use the first row of each file as the column names:

import pandas as pd
import glob

path = r'C:\DRO\DCL_rawdata_files'  # use your path
all_files = glob.glob(path + "/*.csv")

li = []
for filename in all_files:
    # header=0: take the first row of each file as the column names;
    # index_col=None: do not use any column as the row index
    df = pd.read_csv(filename, index_col=None, header=0)
    li.append(df)

# stack the frames vertically and renumber the rows
frame = pd.concat(li, axis=0, ignore_index=True)
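As a variation, here is a minimal sketch of the same idea using pathlib instead of glob (assuming the same directory as above; adjust the path to yours). pd.concat accepts any iterable of DataFrames, so the intermediate list does not need to be built by hand:

from pathlib import Path
import pandas as pd

path = Path(r'C:\DRO\DCL_rawdata_files')  # use your path

# read every CSV in the directory and stack the frames vertically,
# renumbering the rows with ignore_index=True
frame = pd.concat(
    (pd.read_csv(f, header=0) for f in path.glob("*.csv")),
    ignore_index=True,
)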

If you have more doubts, you can check this article on Python for more insight.
