I have a dataframe which is stored in Delta Lake. Is there any command available to list all the tables present in Delta Lake using Databricks?

Can anyone help me with this?

1 Answer


You can try using the table paths. For a managed table, you can build its location from the warehouse directory:

spark.conf.get("spark.sql.warehouse.dir") + s"/$tableName"
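
For example, here is a minimal sketch (Scala, assuming a Databricks notebook where spark is already defined) that lists the contents of the warehouse directory via Hadoop's FileSystem API. Note this is only a heuristic: subdirectories of the warehouse are typically managed tables or database directories, not a one-to-one table listing.

import org.apache.hadoop.fs.Path

val warehouseDir = spark.conf.get("spark.sql.warehouse.dir")
val fs = new Path(warehouseDir).getFileSystem(spark.sparkContext.hadoopConfiguration)

// Each subdirectory of the warehouse holds a managed table (or a database directory).
fs.listStatus(new Path(warehouseDir))
  .filter(_.isDirectory)
  .foreach(status => println(status.getPath))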

Also, if you are using external tables, you can get the location from the catalog metadata:

catalog.getTableMetadata(ident).location.getPath
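
Putting the two together, here is a minimal sketch (Scala, assuming the Delta Lake library is on the classpath, as it is on Databricks) that walks every table registered in the current database, resolves its storage path from the session catalog, and keeps only the paths that actually hold Delta tables:

import io.delta.tables.DeltaTable
import org.apache.spark.sql.catalyst.TableIdentifier

val deltaTablePaths = spark.catalog
  .listTables()
  .collect()
  .filter(!_.isTemporary) // temporary views have no catalog metadata
  .flatMap { t =>
    val meta = spark.sessionState.catalog
      .getTableMetadata(TableIdentifier(t.name, Option(t.database)))
    val path = meta.location.getPath
    // DeltaTable.isDeltaTable checks for a _delta_log directory at the path.
    if (DeltaTable.isDeltaTable(spark, path)) Some(path) else None
  }

deltaTablePaths.foreach(println)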

I hope this will work.

