Essential skills and knowledge for a career in PySpark include proficiency in Python, a strong understanding of Apache Spark's architecture and components, data manipulation and transformation skills, experience with distributed computing, and familiarity with big data technologies such as Hadoop. Equally important is the ability to work with large datasets and to perform data analysis and machine learning tasks using PySpark's libraries and APIs. If you are interested in getting into this field, check out this video about Sucheta Hardikar and how she became a PySpark professional after completing the Big Data Architect Master's Course from Intellipaat.
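To give a feel for the kind of data manipulation and transformation work mentioned above, here is a minimal sketch using PySpark's DataFrame API. The application name and the sample employee data are hypothetical, chosen only for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a local Spark session (hypothetical app name)
spark = SparkSession.builder.appName("skills-demo").getOrCreate()

# Hypothetical sample data: employee name, department, and salary
df = spark.createDataFrame(
    [("Alice", "Engineering", 85000),
     ("Bob", "Engineering", 92000),
     ("Cara", "Sales", 60000)],
    ["name", "dept", "salary"],
)

# Typical day-to-day transformations: filter rows, derive a column,
# then aggregate by group
result = (
    df.filter(F.col("salary") > 70000)       # keep higher earners
      .withColumn("bonus", F.col("salary") * 0.1)  # derive a new column
      .groupBy("dept")                        # aggregate per department
      .agg(F.avg("salary").alias("avg_salary"))
)

result.show()
spark.stop()
```

Fluency with chained transformations like these, and an understanding of how Spark distributes them across a cluster, is the practical core of most PySpark roles.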