I have a dataset that I've loaded into BigQuery; it consists of three separate tables with a common identifier in each of the files.
When I set up my project in Tableau, I performed an inner join on two of the tables using that identifier. I set the connection up as an extract, not live.
There's some geo info in my data: latitudes and longitudes. When I drag the latitude field to the Rows shelf on my worksheet, it takes an eternity; right now it has been processing for 18 minutes and counting, just from dragging latitude onto Rows.
Is there some other way I can work on a random sample of my data rather than having to wait for each query to process? My data is not even that big; it's only around 1M rows.
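
I was thinking of something along these lines, but I don't know if it's the right approach: materialise a sampled copy of the joined data in BigQuery (here via the Python client) and point the Tableau extract at that smaller table instead. All project, dataset, table, and column names below are just placeholders for my real ones:

    from google.cloud import bigquery

    # Placeholder project name -- swap in the real one.
    client = bigquery.Client(project="my-project")

    sample_sql = """
    CREATE OR REPLACE TABLE `my-project.my_dataset.joined_sample` AS
    SELECT *
    FROM `my-project.my_dataset.table_a` AS a
    INNER JOIN `my-project.my_dataset.table_b` AS b
      USING (common_id)                -- the shared identifier column
    WHERE RAND() < 0.1                 -- keep roughly 10% of the joined rows
    """

    # Run the query and wait for the sampled table to be created;
    # the Tableau extract would then be built from joined_sample.
    # (Assumes the two tables don't share other column names, otherwise
    # SELECT * would need explicit column aliases.)
    client.query(sample_sql).result()

Is that a sensible way to do it, or is there a better pattern for working with a sample in Tableau against BigQuery?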