Position: Azure Databricks Data Engineer
Location: Chicago, IL (Onsite)
Duration: 6 months
Job Description:
Design/Development: You will design and support the business's database and table schemas for new and existing data sources for the Lakehouse, and create and support ETL processes that move data into the Lakehouse.
Through your work you will ensure:
The data platform is scalable for large amounts of data ingestion and processing without service degradation.
Processes are built for monitoring and optimizing performance.
Chaos Engineering practices and measures are implemented so that the end-to-end infrastructure functions as expected even if individual components fail.
Collaboration: You will work closely with Product Owners, application engineers, and other data consumers within the business to gather and deliver high-quality data for business cases. You will also work with other disciplines, departments, and teams across the business to arrive at simple, functional, and elegant solutions that balance data needs across the business.
Analytics: You will play an analytical role, quickly and thoroughly analyzing business requirements and translating the results into sound technical data designs. You will document the data solutions and develop and maintain technical specification documentation for all reports and processes.
Skills you MUST have:
6 years of proven professional Data Development experience
3 years of proven experience developing with Azure Databricks or Hadoop/HDFS
3 years of experience with PySpark/Spark
3 years of experience with SQL
3 years of experience developing with Python
Full understanding of ETL and Data Warehousing concepts
Data modeling and query optimization skills, with implementation experience of Data Vault, Star Schema, and Medallion architectures (see the Medallion sketch after this list)
Experience with CI/CD
Experience with version control software
Strong understanding of Agile Principles (Scrum)
Experience with Azure
Experience with Databricks Delta Tables, Delta Lake, and Delta Live Tables
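To ground the Medallion and Delta Lake items above, here is a minimal PySpark sketch of a bronze-to-silver Delta step. It is illustrative only: the paths, table names, and columns (orders, order_id, order_ts) are hypothetical and not part of this role's actual codebase.

```python
# Minimal Medallion-style bronze -> silver step on Delta Lake (illustrative).
# Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw source data as-is in a Delta table.
raw = spark.read.json("/mnt/landing/orders/")  # hypothetical landing path
raw.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: de-duplicate, conform types, and filter bad records for consumers.
silver = (
    spark.read.table("bronze.orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```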
Bonus points for experience in the following:
Proficient with Relational Data Modeling
Experience with Python Library Development
Experience with Structured Streaming (Spark or otherwise); see the streaming sketch after this list
Experience with Kafka and/or Azure Event Hub
Experience with GitHub SaaS / GitHub Actions
Experience with Snowflake / exposure to BI tooling (Tableau, Power BI, Cognos, etc.)
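For the Structured Streaming and Kafka/Event Hub items, a minimal sketch of reading a Kafka topic into a Delta table with Spark Structured Streaming follows. The broker address, topic, and checkpoint path are hypothetical; Azure Event Hubs can be consumed the same way through its Kafka-compatible endpoint.

```python
# Minimal Structured Streaming sketch: Kafka topic -> Delta table (illustrative).
# Broker, topic, table name, and checkpoint path are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
)

# Append each micro-batch to a bronze Delta table, tracking progress in a checkpoint.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # hypothetical path
    .outputMode("append")
    .toTable("bronze.orders_stream")
)
query.awaitTermination()
```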
Mandatory skills:
Azure Databricks
PySpark
Azure Data Factory