
in BI by (17.6k points)

How do I get 4 million rows and 28 columns from Python to Tableau in a table form?

I assume (based on searching) that I should use a JSON format. This format can handle a lot of data and is fast enough.

I have made a subset of 12 rows of the data and tried to get it working. The good news: it's working. The bad news: not the way I want it to.

My issue is that when I import it into Tableau it doesn't look like a table. I have tried the variants that are displayed here.

This is the statement in Python (pandas):

jsonfile = pbg.to_json("//vsv1f40/Pricing_Management$/Z-DataScience/01_Requests/Marketing/Campaign_Dashboard/Bronbestanden/pbg.json",orient='values')
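For reference, orient='values' writes only the cell values, with no column names, so a consumer cannot reconstruct the table headers; orient='records' keeps one {column: value} object per row, which is the shape a table-oriented reader expects. A minimal sketch with a toy DataFrame (the real pbg and its file path are abbreviated here):

```python
import pandas as pd

# Toy stand-in for the real DataFrame (called pbg in the question);
# the real one has 4 million rows and 28 columns.
pbg = pd.DataFrame({"product": ["A", "B"], "price": [10.5, 12.0]})

# orient='values' writes a bare list of row arrays: the column
# names are lost, so the headers cannot be recovered downstream.
print(pbg.to_json(orient="values"))

# orient='records' writes one {column: value} object per row,
# which JSON consumers generally read back as a flat table.
print(pbg.to_json(orient="records"))
```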

Maybe I select too many schemas in Tableau (I select them all), but I think my problem is in Python. Do I need to use another library instead of Pandas? Or do I need to change the variables?

Other ways are also welcome. I have no preference for JSON, but I thought that was the best way, based on the search results.

Note: I am new to Python and Tableau :) I use Python 3.5.2 and work in Jupyter. For Tableau I only have the free trial desktop version.


1 Answer

Best answer
  • For certain types of data JSON is good, but if your DataFrame is purely tabular (no MultiIndex, no complex objects, only simple data types such as strings, integers, and floats), then the best format to use is a comma-separated values (CSV) text file, as it takes up the least space.

  • Using the to_csv() method, a DataFrame can easily be saved as a CSV, and there are a number of customization options available. I'm not terribly familiar with Tableau, but according to their website, CSV files are a supported input format.
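A minimal sketch of the CSV route described above, using a toy DataFrame in place of the question's pbg (the filename is illustrative):

```python
import pandas as pd

# Toy stand-in for the question's pbg DataFrame.
pbg = pd.DataFrame({"campaign": ["x", "y"], "revenue": [100.0, 250.5]})

# index=False drops pandas' row index, so the file contains only the
# data columns; the first line of the CSV becomes the column headers.
pbg.to_csv("pbg.csv", index=False)
```

In Tableau, connect to the resulting file via Connect > Text file; the header row supplies the column names.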
