You can do this in the following ways:
Using a mixture of SSIS (SQL Server Integration Services) and a DB creation script. This gets you the data plus the metadata that SSIS alone does not transfer, and it is still very simple. First transfer the data using SSIS (see instructions below), then generate a DB creation script from the SQL Azure database (right-click the database in SSMS and choose Generate Scripts -> Database Create), and replay that script on your local database.
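Replaying the generated creation script can itself be scripted. Below is a minimal sketch that builds a sqlcmd command line for it; the server name and script path are hypothetical, and it assumes sqlcmd is installed and you connect with Windows authentication.

```python
# Hedged sketch: replay a "Generate Scripts -> Database Create" script on a
# local SQL Server instance via sqlcmd. Server name and file path below are
# hypothetical placeholders.
import subprocess

def build_sqlcmd(server, database, script_path):
    # -S: target server, -d: database context (master, since the script
    # creates the database), -E: Windows authentication, -i: input script
    return ["sqlcmd", "-S", server, "-d", database, "-i", script_path, "-E"]

cmd = build_sqlcmd(r".\SQLEXPRESS", "master", r"C:\backup\create_db.sql")
# subprocess.run(cmd, check=True)  # uncomment to actually execute the script
```

Leaving the `subprocess.run` call commented out lets you inspect the command first; the same pattern drops into a scheduled task once it works.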
Using SSIS (SQL Server Integration Services). It only imports the data in your tables. Column properties, constraints, keys, indexes, stored procedures, triggers, security settings, users, logins, etc. are not transferred. However, the process is very simple and can be done entirely through a wizard in SQL Server Management Studio.
In SQL Server Management Studio (SSMS) create a new empty database on your local SQL instance.
Choose Import Data from the context menu (right-click the database -> Tasks -> Import data...)
Enter the connection parameters for the source (SQL Azure). Choose ".Net Framework Data Provider for SqlServer" as the provider.
Choose the existing empty local database as the destination.
Follow the wizard -- you will be able to select the tables whose data you want to copy, and skip any you don't need. E.g. if you keep application logs in the database, you probably don't need them in your backup.
You can automate this by saving an SSIS package and re-executing it any time you want to re-import the data. Note that SSIS can only import into a clean DB; you cannot do incremental updates to your local database once you have done it once.
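Re-running a saved package can be done with the DTExec utility that ships with SSIS. The sketch below builds the command line; the package path is hypothetical, and remember that the target database must be clean before each run.

```python
# Hedged sketch: re-execute a saved SSIS package with DTExec to refresh a
# local copy of the data. The .dtsx path is a hypothetical placeholder.
import subprocess

def build_dtexec(package_path):
    # /F runs a package stored in the file system (standard DTExec switch)
    return ["DTExec", "/F", package_path]

cmd = build_dtexec(r"C:\backup\CopyAzureDb.dtsx")
# subprocess.run(cmd, check=True)  # uncomment on a machine with SSIS installed
```

Wrapping this in a scheduled task gives you a periodic (if full, not incremental) refresh of the local copy.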
Lastly, you can use the Import/Export Service in SQL Azure. This transfers the data (together with the schema objects) to Azure Blob Storage as a BACPAC. You will need an Azure Storage account, and the export is performed in the Azure web portal: it is as easy as clicking an "Export" button once you have selected the database you want to export. The downside is that it is a manual procedure only; I don't know a way to automate it through tools or scripts -- at least not the first part, which requires a click on the web page.
The import process builds a new DAC in two stages:
1) The import creates a new DAC and associated database using the DAC definition stored in the export file, the same way a DAC deploy creates a new DAC from the definition in a DAC package file.
2) The import bulk-copies in the data from the export file.
When you import a DAC to an instance of the Database Engine, the imported DAC is included in the SQL Server Utility the next time the utility collection set is transferred from the instance to the utility control point. The DAC will then appear in the Deployed Data-tier Applications node of the Management Studio Utility Explorer and be reported in the Deployed Data-tier Applications details page.
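The local half of this (importing the downloaded BACPAC) can also be done from the command line with the SqlPackage utility from SQL Server Data Tools. A minimal sketch, with hypothetical file, server, and database names:

```python
# Hedged sketch: import a downloaded BACPAC into a local instance using
# SqlPackage. All paths and names below are hypothetical placeholders.
import subprocess

def build_sqlpackage_import(bacpac_path, server, database):
    # /Action:Import restores both the schema and the data from the BACPAC
    return [
        "SqlPackage",
        "/Action:Import",
        f"/SourceFile:{bacpac_path}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
    ]

cmd = build_sqlpackage_import(r"C:\backup\mydb.bacpac", "localhost", "MyLocalCopy")
# subprocess.run(cmd, check=True)  # uncomment once the BACPAC is downloaded
```

This only automates the import side; the export click in the Azure portal, as noted above, remains manual.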