Sr Data Engineer - Xumo Job at Comcast Corporation, Irvine, CA

  • Comcast Corporation
  • Irvine, CA

Job Description

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

**Job Summary**

We are looking for a talented Senior Data Engineer to play a crucial role in shaping our data and recommendation strategy and empowering data-driven decision-making across the organization. This role will involve analyzing, designing and implementing data solutions that provide actionable insights into our Video Streaming Application, including deep dives into user behavior, content performance, and the effectiveness of our recommendation engine. Xumo, a joint venture between Comcast and Charter, was formed to develop and offer a next-generation streaming platform on a variety of branded 4K streaming devices and smart TVs. Powered by Comcast’s global technology platform, Xumo devices and services feature an entertainment experience designed to make it easy for consumers to find and enjoy their favorite streaming content through a world-class user interface and voice search, and for partners to meaningfully connect and engage with millions of consumers.

**Job Description**

As a Senior Data Engineer, you will be a key contributor in building and maintaining a scalable and reliable data infrastructure that supports our business goals.
You will collaborate with cross-functional teams to translate business requirements into technical solutions, ensuring data integrity and accessibility. Your expertise in data analysis & modeling, ETL processes, and data warehousing will be instrumental in driving our organization's success through data-driven insights.

Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise.

  • **Position is office-based in Irvine, CA: 4 days/week in office & 1 day/week remote.**
  • **_Position Duties:_**
  • **Lead Data Engineering and Architecture**
  • Collaborate with product, engineering, and data science teams to define data requirements and translate them into scalable data solutions, with a focus on supporting statistical analysis and performance reporting.
  • Design, develop, and maintain robust data pipelines and ETL processes to ingest, transform, and load data from various sources, ensuring data integrity and availability for performance analysis.
  • Architect and implement data models and schemas optimized for performance, scalability, and analytical queries, enabling efficient statistical analysis and reporting.
  • Build and maintain data infrastructure components, including data warehouses, data lakes, and real-time streaming systems, to support the needs of performance monitoring and reporting.
  • **Develop and Maintain Data Infrastructure**
  • Work closely with backend engineers to optimize data storage and retrieval mechanisms for maximum efficiency, particularly for performance-critical data.
  • Implement and maintain data quality checks and monitoring systems to ensure data accuracy and reliability, crucial for accurate performance analysis and reporting.
  • Develop and maintain comprehensive documentation for data architecture, pipelines, and processes, including guidelines for performance testing and optimization.
  • Continuously evaluate and adopt new technologies and best practices to improve data infrastructure performance and scalability, supporting the growing needs of performance analysis.
  • **Enable Data-Driven Decision-Making**
  • Partner with data scientists and analysts to ensure data accessibility and usability for advanced analytics and reporting, including performance analysis and statistical modeling.
  • Develop and maintain tools and frameworks for data exploration, visualization, and analysis, specifically tailored for performance monitoring and reporting.
  • Provide technical expertise and guidance to stakeholders on data-related inquiries and best practices, particularly in the areas of performance analysis and reporting.
  • Contribute to the development and implementation of data governance policies and procedures, ensuring data quality and consistency for performance analysis.
  • **_Qualifications_**
  • 8+ years of experience in data engineering, data analysis, data warehousing, or a related field.
  • 5+ years of experience with SQL and working with relational databases (e.g., MySQL, PostgreSQL).
  • Strong experience with big data technologies (e.g., Hadoop, Spark) and cloud-based platforms (e.g., AWS, Google Cloud, Azure).
  • Proficiency in building and maintaining data pipelines using ETL/ELT tools.
  • Experience with data modeling and schema design for data warehouses and data lakes.
  • Experience with data governance and compliance standards.
  • Proven ability to translate business requirements into scalable data solutions.
  • Strong communication skills and ability to collaborate with cross-functional teams.
  • Self-starter with excellent organizational and problem-solving skills.
  • **_Highly Preferred Experience:_**
  • Experience with data analysis using Google BigQuery.
  • Experience with data visualization tools such as Sisense, Tableau, Power BI, or Looker.
  • Experience with programming languages for data analysis (e.g., Python, R).
  • Experience with predictive analytics and machine learning algorithms.
  • Familiarity with business domains such as customer analytics, marketing, or operations.
  • Experience working with data lakes and data warehouse technologies (e.g., Snowflake, Redshift).
  • **Core Responsibilities**
Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs. Focuses on ensuring data quality during ingest and processing as well as the final load to the target tables. Creates standard ingestion frameworks for structured and unstructured data as well as checking and reporting on the quality of the data being processed. Creates standard methods for end users' downstream applications to consume data, including but not limited to database views, extracts and Application Programming Interfaces (APIs). Develops and maintains information systems (e.g., data warehouses, data lakes) including data access APIs. Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms (e.g., Kubernetes, Teradata) and cloud platforms (e.g., Databricks). Determines the appropriate storage platform across different on-prem (MinIO, Teradata) and cloud (AWS S3, Redshift) options depending on the privacy, access and sensitivity requirements. Understands the data lineage from source to the final semantic layer, along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes. Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality as well as process optimization. Develops strategies for data acquisition, archive recovery, and database implementation.
Manages data migrations/conversions and troubleshoots data processing issues. Understands data sensitivity and customer data privacy rules and regulations, and applies them consistently in all Information Lifecycle Management activities. Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications. Solves abstract problems beyond a single development language or situation by reusing data files and flags already set. Solves critical issues and shares knowledge (e.g., trends, aggregates).

Job Tags

Full time, Local area
