I think the problem here is that you haven’t downloaded a winutils.exe build compatible with your Hadoop/Spark version.
I would suggest you start from scratch and follow these steps:
Download winutils.exe from the repository to some local folder, e.g. C:\hadoop\bin.
Set HADOOP_HOME to C:\hadoop.
Create the C:\tmp\hive directory (using Windows Explorer or any other tool).
Open the command prompt with admin rights.
Run C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
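Run from an elevated command prompt, the steps above look roughly like this (a sketch assuming C:\hadoop as the install folder; adjust the paths to wherever you placed winutils.exe):

```shell
:: Point HADOOP_HOME at the folder that contains bin\winutils.exe
:: (setx persists the variable; open a new prompt for it to take effect)
setx HADOOP_HOME C:\hadoop

:: Create the Hive scratch directory
mkdir C:\tmp\hive

:: Grant full permissions on it so Spark/Hive can write there
C:\hadoop\bin\winutils.exe chmod 777 \tmp\hive
```

Note that `setx` only affects new command prompt sessions, so open a fresh prompt (or set the variable in System Properties) before launching Spark.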
After this you may still see some warnings, but there should be no errors and your Spark applications should run fine.