Senior Data Integration Engineer
Comcast
West Chester, PA, United States
Position Summary
The DS group is responsible for Data Services at Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast, establishing the enterprise data sources (EDW, ODS, and transactional data sources) as a single version of truth for data analysis across the Comcast business.
The Engineer will provide ETL solutions to meet technically challenging business requirements (complex transformations, high data volumes). The ideal candidate will have a deep understanding of technical and functional designs for ETL, Database, Data Warehousing, and Reporting areas. This job plays a key role in customer analytics projects, systems design, and development. The team is accountable for ensuring that the overall functional and technical design of work processes and applications meets business and functional requirements. In addition, the work must be sustainable and conform to the operational infrastructure.
This position will be responsible for designing, developing, testing, tuning, and deploying software solutions within the Hadoop and Teradata ecosystems.
The Engineer will work closely with administrators and application teams to ensure business applications are highly available and performing within agreed-upon service levels. This position will also work closely with business stakeholders, architects, and other development teams in an agile manner to quickly realize business value.
-Play a key senior-level role in the ETL and Data Warehouse area by implementing a solid, robust, extensible design that supports key business flows
-Build and maintain optimized ETL solutions to process and load source system data into Hadoop using Sqoop or microservices, and into Teradata using Informatica and other utilities
-Develop, implement, and maintain development best practices within Teradata environments
-Strong experience with Hive, Pig, Flume, Sqoop, Kafka, and Storm
-Work with peers in administration to tune code and plan for capacity needs
-Deliver clear, well-communicated and complete project documents.
-Analyze and solve problems and recommend improvements to existing systems and processes.
-Design, code and test major segments of a system in a timely manner.
-Lead unit, system acceptance, and performance testing by designing test cases, building test data, executing and evaluating tests, and recommending and making improvements and fixes to the system
-Ensure data security, data quality, and governance of data within the Hadoop and Teradata ecosystems
-Develop, implement and maintain applications with RDBMS technologies
-Knowledge in data warehousing methodologies and best practices required.
-Strong verbal and written communication skills required.
-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.
-Skills in navigating a large organization in order to accomplish results required.
-Ability to initiate and follow through on complex projects of both short and long term duration required.
-Excellent organizational and time management skills required.
-Excellent analytical and problem solving skills required.
-Ability to work independently, assume responsibility for job development and training, research and resolve questions and problems, request supervisor input, and keep the supervisor informed required.
-Participate on interdepartmental teams to support organizational goals
-Perform other related duties and tasks as assigned
-Punctual, regular, and consistent attendance
Required Skills/Experience:
-Bachelor's Degree is required.
-5+ years of experience working as a data integration engineer, data solution architect (EDW, ODS), ETL architect, or in a similar role required.
-Five to seven years of experience with data integration and development of end-to-end ETL architecture using Linux, Informatica, Teradata, SQL, and BTEQ
-One or more years of hands-on experience developing applications using one or more Hadoop ecosystem components, e.g. Sqoop, Hive, Pig, Flume, Accumulo, HBase, Kafka, Spark, and Storm
-Understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and Agile/Scrum development methodologies required
-Ability to work effectively across organizational boundaries
-Excellent oral, written, analytical, problem solving, and presentation skills
-Ability to manage and coordinate 1-3 matrixed resources
-Experience with managed services and onshore/offshore development is a must
Desired Skills/Experience:
-Telecommunications experience; knowledge of telecommunications/cable billing and customer care systems, e.g. DST, CSG, and Amdocs
-Knowledge of NoSQL platforms
-Hadoop, Teradata, or TOGAF certification
Comcast is an EOE/Veterans/Disabled/LGBT employer