Overview
W-2 only; no C2C. Must work in the Pacific Time Zone.

Aquent is proud to partner with a global leader in innovation, a company that constantly explores potential, breaks barriers, and pushes the edges of what's possible. This organization thrives on diversity, imagination, and a collective drive to make things better. As a **Lead Data Engineer**, you will be at the forefront of shaping the future of data and analytics, directly influencing critical decisions and driving the evolution of our client's data architecture. Your expertise will not only enhance the velocity and quality of data pipelines but also empower teams to deliver groundbreaking solutions that redefine the industry.

**About the Role:**

We are seeking a visionary Lead Data Engineer to join a highly motivated, global team dedicated to building cutting-edge data and analytics solutions for a prominent enterprise. This is a hands-on leadership role in which you will define development standards, frameworks, and best practices, significantly impacting the efficiency and quality of data engineering efforts. You will be instrumental in designing and developing critical data pipelines, streamlining architecture, and ensuring data quality and reliability across data lakes and warehouses. This role offers an exciting opportunity to lead, mentor, and innovate within a dynamic environment.

**Key Responsibilities:**

* Define and communicate technical environment requirements, determine project scope, and provide technical estimates and capacity planning.
* Translate product backlog items into robust engineering designs and logical units of work.
* Lead the development of technical solutions that align with architectural standards and meet business needs.
* Drive collaboration with architecture and platform teams on integration needs and designs, creating advanced technical designs and reviewing proof-of-concept efforts.
* Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile/Scrum methodologies.
* Define and apply appropriate data acquisition, processing, and consumption strategies for various technical scenarios.
* Design and implement distributed data processing pipelines using industry-standard tools and languages.
* Profile and analyze data to design scalable solutions.
* Drive technical strategies for new data projects and optimize existing solutions.
* Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational challenges.
* Build utilities, user-defined functions, libraries, and frameworks to enhance data flow patterns and implement complex automated routines using workflow orchestration tools.
* Lead collaborative reviews of design, code, test plans, and dataset implementations to uphold data engineering standards.
* Identify and remove technical bottlenecks for your engineering squad, providing leadership, guidance, and mentorship to other data engineers.
* Anticipate, identify, and resolve data management issues to improve data quality.
* Build and incorporate automated unit tests and participate in integration testing efforts.
* Utilize and advance software engineering best practices, including source control, code review, testing, and continuous integration/delivery (CI/CD) on cloud infrastructure.

**Qualifications:**

**Must-Have Skills & Experience:**

* Bachelor's degree in computer science, data science, software engineering, or a related field, or an equivalent combination of education, experience, and training.
* Minimum 6 years of relevant work experience in designing and implementing innovative data engineering capabilities and end-to-end solutions.
* Advanced experience with data modeling, warehousing, and building ETL pipelines, including experience with ETL tools like Matillion and/or PySpark.
* Expertise in building and operating highly available, distributed systems for data extraction, ingestion, and processing of large datasets, with the ability to deliver end-to-end projects independently.
* Advanced experience building cloud-scalable, real-time, and high-performance data lake solutions, preferably with Databricks, Snowflake, and/or AWS.
* Advanced experience with big data technologies such as Hadoop, Hive, Spark, EMR, and orchestration tools like Airflow.
* Advanced proficiency in SQL and modern scripting or programming languages, such as Python and Shell.
* Experience with CI/CD pipelines for code deployment, with exposure to tools like GitHub, Jenkins, Terraform, and Databricks Asset Bundles.
* Strong problem-solving and interpersonal communication skills.
* Demonstrated ability to deliver results on multiple projects in a fast-paced, agile environment.
* Strong desire to learn, share knowledge, and coach team members.

**Nice-to-Have Skills & Experience:**

* Certifications in Databricks and/or AWS.
* Experience with Matillion (ETL).
* Experience with data migration projects, particularly from Snowflake to Databricks.
* Leadership behavior and the ability to work independently.
* Familiarity with best practices around documentation.

**About Aquent Talent:**

Aquent Talent connects the best talent in marketing, creative, and design with the world's biggest brands.
Our eligible talent get access to amazing benefits such as subsidized health, vision, and dental plans, paid sick leave, and retirement plans with a match.
Aquent is an equal-opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or other legally protected characteristics. We're about creating an inclusive environment, one where different backgrounds, experiences, and perspectives are valued, and everyone can contribute, grow their careers, and thrive.