There are two general approaches to this.
Approach 1: Extract your data.
Especially for data sources that bill per byte queried (BigQuery, Athena, etc.), extracts make a great deal of sense. It depends on how 'fresh' the data must be for the users. (Of course, all users will say 'live is the only way to go', but dig into this a little and see what the real requirement is.) Refreshes can be scheduled as frequently as every 15 minutes. The real power of refreshes comes from incremental refreshes, where only new records are added (keyed on an integer or date column). This is a great way to reduce costs if your BigQuery table is partitioned (which it should be). Since Tableau extracts are stored in .hyper files, a format of Tableau's own design and under its control, they are extremely fast and optimized specifically for use in Tableau.
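To make the cost point concrete, here is a minimal sketch (Python, using the google-cloud-bigquery client) of the kind of query an incremental refresh effectively runs against a date-partitioned table. The project, dataset, table, and column names are placeholders; in practice Tableau issues the query for you based on the incremental-refresh column you configure.

```python
from datetime import date
from google.cloud import bigquery

# Placeholder project/table/column names for illustration only.
client = bigquery.Client(project="my-project")

# High-water mark: the newest order_date already present in the extract.
last_loaded = date(2024, 6, 1)

# Filtering on the partitioning column prunes old partitions, so BigQuery
# only scans (and bills for) the new rows being appended to the extract.
query = """
    SELECT order_id, order_date, customer_id, amount
    FROM `my-project.sales.orders`
    WHERE order_date > @last_loaded
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("last_loaded", "DATE", last_loaded)
        ]
    ),
)
print(f"{job.result().total_rows} new rows since {last_loaded}")
```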
Approach 2: Create 3 data sources (or more).
Certify these data sources after validating that they provide correct information, and give users clear descriptions of each. (A sketch of certifying them programmatically follows the list below.)
1. The original large dataset.
2. The subset of ~20 fields that covers the 90% case.
3. The remaining fields for the 10% case.
4. An extract of 1.
5. An extract of 2.
6. An extract of 3.
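Once the data sources are validated, certification can be done in the Tableau Server/Cloud UI or scripted. Below is a rough sketch using the tableauserverclient library; the server URL, token, site, and data source names are made up, and it assumes the certified/certification_note fields that recent releases of the library expose.

```python
import tableauserverclient as TSC

# Placeholder server, credentials, and data source names.
server = TSC.Server("https://tableau.example.com", use_server_version=True)
auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="analytics")

to_certify = {
    "Sales - Full (Live)",
    "Sales - Core ~20 Fields (Live)",
    "Sales - Remaining Fields (Live)",
    "Sales - Full (Extract)",
    "Sales - Core ~20 Fields (Extract)",
    "Sales - Remaining Fields (Extract)",
}

with server.auth.sign_in(auth):
    all_datasources, _ = server.datasources.get()
    for ds in all_datasources:
        if ds.name in to_certify:
            # Mark as certified and say why, so users can trust it at a glance.
            ds.certified = True
            ds.certification_note = "Row counts and totals validated against the source."
            server.datasources.update(ds)
```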
Importantly, if field names match across the data sources (i.e., they are never renamed manually), it should be easy for a user to 'scale up' to a larger dataset as needed. They can generally start their exploration against a small subset of the data, then use the 'Replace Data Source' feature to switch to a larger data source while keeping their existing views. (Scaling back down this way doesn't work as well, if at all.)
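As a quick sanity check that 'Replace Data Source' will map fields cleanly, you can confirm that every column name in the subset view also exists in the full table. A small sketch against BigQuery, again with placeholder table IDs:

```python
from google.cloud import bigquery

# Placeholder table/view IDs for illustration only.
client = bigquery.Client(project="my-project")

full_fields = {f.name for f in client.get_table("my-project.sales.orders").schema}
subset_fields = {f.name for f in client.get_table("my-project.sales.orders_core").schema}

# Any field in the subset that is missing (or renamed) in the full table will
# break the automatic mapping when a workbook is pointed at the larger source.
missing = subset_fields - full_fields
print("Subset fields not present in the full table:", sorted(missing) or "none")
```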