Data Platform Engineer Resume Samples

The Guide To Resume Tailoring

Guide the recruiter to the conclusion that you are the best candidate for the data platform engineer job. It’s actually very simple. Tailor your resume by picking relevant responsibilities from the examples below and then add your accomplishments. This way, you can position yourself in the best way to get hired.

Craft your perfect resume by picking job responsibilities written by professional recruiters

Pick from the thousands of curated job responsibilities used by the leading companies

Tailor your resume & cover letter with wording that best fits each job you apply for

Resume Builder

Create a Resume in Minutes with Professional Resume Templates

CHOOSE THE BEST TEMPLATE - Choose from 15 Leading Templates. No need to think about design details.
USE PRE-WRITTEN BULLET POINTS - Select from thousands of pre-written bullet points.
SAVE YOUR DOCUMENTS IN PDF FILES - Instantly download in PDF format or share a custom link.

Arnold Runte
870 Koss View
Boston, MA
Phone: +1 (555) 159 9267
Experience
Big Data Platform Engineer
Emard Inc
Philadelphia, PA
  • Implement new services to agreed SLAs and ensure transition to support, or standardise services to Global Services
  • Should believe in the "You Build! You Ship! You Run!" philosophy
  • Work with engineering support on cluster upgrades, participating in the installation, configuration and administration of multi-node Big Data platforms
  • Define and build lightweight (low-overhead), real-time monitoring of system characteristics, plus tools for correlating and analysing those statistics, to ensure good health of the Big Data infrastructure (see the sketch after this list)
  • Troubleshoot cluster & ecosystem product issues as a priority and lead them through to resolution, working to SLAs as defined within the operational environment
  • Passion for software engineering and craftsman-like coding prowess
  • Server operating system internals, benchmarking and performance tuning (Linux)
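The monitoring bullet above describes low-overhead, real-time collection of system statistics. Below is a minimal, hypothetical sketch of that idea in Python; it assumes the psutil library is installed, and the metric names, sampling interval and output target are illustrative rather than taken from any of these resumes.

```python
# A rough sketch of low-overhead host metric sampling for a Big Data cluster node.
# psutil is assumed to be installed; metric names and the emission interval are illustrative.
import json
import socket
import time

import psutil


def sample_metrics() -> dict:
    """Collect a small, cheap set of host-level statistics."""
    return {
        "host": socket.gethostname(),
        "ts": time.time(),
        "cpu_pct": psutil.cpu_percent(interval=1),   # 1-second sampling window
        "mem_pct": psutil.virtual_memory().percent,
        "disk_pct": psutil.disk_usage("/").percent,
    }


if __name__ == "__main__":
    # Emit one JSON line per interval; a collector (e.g. Kafka or Elasticsearch)
    # would normally consume these lines for correlation and analysis.
    while True:
        print(json.dumps(sample_metrics()), flush=True)
        time.sleep(30)
```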
Senior Data Platform Engineer
Cole-Monahan
Chicago, IL
  • Develop framework, metrics and reporting to ensure progress can be measured, evaluated and continually improved
  • Assist application development teams during application design and development for highly complex and critical data projects
  • Developing and deploying distributed computing Big Data applications using Open Source frameworks like Apache Spark, Apex, Flink, Storm, NIFI and Kafka
  • Operationalize data management and governance tools on an open source framework
  • Work with a team of engineers and developers to deliver against the overall technology data strategy
  • Integrate with BI tools and work with user interface engineers to visualize the output of queries and analytics performed on the data
  • Drive knowledge management practices for key enterprise data platforms and collaborate on solution design and delivery
Senior Big Data Platform Engineer
Dicki-Mosciski
Chicago, IL (present)
  • You are a self-starter and can work independently on technical projects, but also work collaboratively with project team members through an agile development process
  • You will perform Administration, Performance Tuning & Capacity Planning on Large Scale Clusters (Hadoop/Kafka)
  • Understand which technologies work in a multi-tenant environment and which do not
  • Participate in collaborative software and system design and development of the new platform
  • Work on large-scale, multi-tier big data engagements
  • You are involved in usage/performance forecasting/modeling and monitoring
  • Understand the challenges of working across data centres
Education
Bachelor’s Degree in Computer Science
University of Florida
Skills
  • Your online identity: Github, Twitter, blog, and portfolio. If you don't have a profile on Github, links to code samples are also helpful
  • Very strong technical communication, written and spoken
  • Experience with solving problems and delivering high quality results in a fast paced environment
  • 1+ years of Linux/Unix experience, including basic commands, shell scripting and solution engineering
  • Experience with various tools and frameworks that enable capabilities within the data ecosystem (Hadoop, Kafka, NIFI, Hive, YARN, HBase, NoSQL)
  • Experience developing data solutions on AWS
  • 1+ years of Hadoop administration experience
  • Cover Letter that answers the question: what is the toughest thing you've built that you're proud of?
  • Experience with NiFi, specifically developing custom processors and workflows
  • Experience managing NoSQL databases such as HBase, Cassandra and MongoDB

15 Data Platform Engineer resume templates

1

Semantics Data Platform Engineer Resume Examples & Samples

  • Candidates must possess a strong understanding of semantic knowledge base modeling techniques, including knowledge store technologies, RDF/OWL-based ontology modeling, SPARQL query engines, and conceptual data modeling concepts (see the sketch after this list)
  • Candidates must also fully understand software development life cycle principles, practice, and disciplines
  • Candidates should also have strong knowledge of ETL concepts, data integration, big data, and data migration
  • Able to multi-task and work in a dynamic, fast-paced environment
  • Experience with database internals, transactions, query processing and optimization is a huge plus
  • Ability to work with Python (our server-side language of choice) is desired but is not a must
  • Strong understanding of relational, NoSQL, and graph databases is highly desired
  • Experience writing compilers and parsers
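As a hedged illustration of the RDF/SPARQL skills called out in the first bullet, here is a small Python sketch using the rdflib library; the namespace, triples and query are entirely made up.

```python
# A minimal, hypothetical sketch of RDF modeling and SPARQL querying with rdflib;
# the namespace and data below are illustrative, not taken from the resume samples.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/schema#")

g = Graph()
g.bind("ex", EX)

# Model a tiny ontology instance: one engineer working on one platform.
g.add((EX.alice, RDF.type, EX.Engineer))
g.add((EX.alice, EX.worksOn, EX.dataPlatform))
g.add((EX.alice, EX.name, Literal("Alice")))

# SPARQL query against the in-memory knowledge store.
results = g.query(
    """
    PREFIX ex: <http://example.org/schema#>
    SELECT ?name WHERE {
        ?person a ex:Engineer ;
                ex:worksOn ex:dataPlatform ;
                ex:name ?name .
    }
    """
)

for row in results:
    print(row.name)
```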
2

Big Data Platform Engineer Resume Examples & Samples

  • Expert in Hadoop, Storm, Kafka, Elasticsearch or a similar open source project
  • Previous experience with Java, Python, or Ruby
  • Bachelor's degree in data management, computer science, or related field preferred
  • Six or more years of experience in data management or computer programming
  • Experience working with programming concepts
  • Experience working with company database and statistical software
  • Strong communication skills to interact with team members, clients, and support personnel
  • Good analytical and problem-solving skills for selecting data tools and resolving issues
3

EEA Junior Data Platform Engineer Resume Examples & Samples

  • Design and drive the implementation of data platform functional and non-functional requirements
  • Be an owner of the implemented functionality: from design and implementation to production troubleshooting
  • Work closely together with the design team members and also other teams in the organization (integration and verification team, product documentation)
  • Mentor/coach junior design team members during day to day work
  • Be responsible for the quality of the technical solution
  • Drive innovation by coming up with new ideas related to our product/our processes
  • B.S. in Computer Science or Engineering or a related field; MS preferred
  • Strong programming skills in Java SE
  • Strong Linux/Unix power-user skills (good command of shell scripting)
  • Familiarity with the Hadoop ecosystem is an advantage
  • Strong and pragmatic problem-solving skills
  • Communication and cooperation skills
4

Junior Big Data Platform Engineer Resume Examples & Samples

  • Ensure conceptual and architectural integrity of the platform
  • Play active role in Communities of Practice
  • Attend conferences / meetups
  • Understanding of Entity Resolution and Record Linkage
  • Ability to work in a dynamic and flexible environment
5

Big Data Platform Engineer Resume Examples & Samples

  • Participate in collaborative software and system design and development of the new platform
  • Explore and evaluate new ideas and technologies
  • Experience in working with unstructured or semi-structured data
  • Experience with Hadoop Ecosystem like Oozie, Hive, Avro, Parquet etc
  • Experience in building self-contained applications using Docker, Vagrant, Chef
6

Senior Big Data Platform Engineer Resume Examples & Samples

  • Work on large-scale, multi-tier big data engagements
  • Be a mentor and role model to less experienced developers
  • Ability to work in a dynamic and flexible environment with the ability to influence
7

Senior Big Data Platform Engineer Resume Examples & Samples

  • Understand the challenges of building and managing large scale systems
  • Understand the challenges of working across data centres
  • Understand which technologies work in a multi-tenant environment and which do not
  • Deploy and test out new technologies like Kylin, Apex and NiFi
  • Lead with a positive attitude and by example
  • Linux administration (RHEL/CentOS 7 preferred)
8

Senior Big Data Platform Engineer Resume Examples & Samples

  • Responsible for one large project at all times
  • Build analytical applications using Hadoop eco system
  • Build applications using MapReduce/Java, Spark/Scala
  • Build web services using Java tech stack
  • Work on performance optimizations on Hbase and Solr
  • Work on performance optimization on Java MapReduce jobs
  • Work on Performance optimization on Spark Jobs
  • Debug complex production scenarios
  • Build complex data pipelines
  • A Bachelor’s degree in Computer Science, Management Information Systems or related field, or equivalent work experience
  • Experience building and managing complex products and solutions
  • Experience developing RESTful web services in any Java framework
  • Experience building data pipelines
  • Experience working in Linux/Unix environment
  • Expert level programming in Java
9

Principal Data Platform Engineer Resume Examples & Samples

  • 50% Data Platform Strategy and Execution
  • Design and build across the comprehensive component set of the Cargill Data Platform supporting a Platform as a Service model that enables Big Data, Data Management, Analytics and Reporting
  • Design and build future state ingestion, storage, access and analytics frameworks and capabilities to meet the needs of Cargill
  • Deliver coaching on data management and analytics techniques and technologies, including open-source and in-house developed software platforms, that ensure automated and continuous testing, integration, and deployment of software and infrastructure across multiple cloud providers as well as internal datacenters
  • Design and build complex enterprise solutions that solve business problems using technology
  • Lead the development of a developer focused environment that allows for a natural delivery method to fit multiple developer personas (i.e. Java/.Net)
  • Assist in driving a software delivery model that supports the current multi-location, multi-continent, multi-cultural operating framework of Cargill
  • Lead and participate in continual analysis and planning to ensure Global IT toolsets and technologies are relevant, reliable and cost effective
  • 15% Business Relationship Management and Consulting
  • Work with ScrumMasters, Product Owners, Development Leads and QA Leads in CI/CD, Source Code Management, Containerized solutions and cloud technologies, especially with respect to Data Management and Advanced Analytics
  • Regularly interface with architects, analysts, process designers, and BU/Function subject matter experts to understand and evaluate data and analytics capabilities and/or functional capability requirements
  • Partner with Businesses to determine functional requirements and translate into platform specific design (including ingestion and storage patterns)
  • 15% Architecture Definition Methodology and Implementation
  • Agile Training/Tools: Responsible for working as part of a matrixed team to define and provide hands-on training for all critical software delivery tools and processes as well as the supporting tools that teams will use. You will also be expected to provide input for which toolset will best support our operating needs
  • Assess and help drive adoption of new technologies and methods within the team and across Cargill
  • Build prototype to prove out concepts
  • 10% Coaching and Collaboration
  • Coach and mentor development teams on usage and adoption of our Continuous Delivery toolsets and overall infrastructure as code and automation best-practices in the context of the Cargill Data Platform
  • Work closely with application teams looking to shift to a more iterative delivery model and ensure that their full-stack is fully automated, tested, and successfully packaged into production releases
  • Build and coach on technical capabilities to enable faster innovation, accelerate time-to-market for all of our consumer experiences, and deliver industry leading developer experiences
  • Work closely with our application teams to ensure all capabilities align with actual application delivery needs and pain points
  • Mentor Architects and Engineers on Cargill teams
  • 10% Run Operations
  • Collaborate with Ops and Architecture organizations to maintain awareness on the health of overall Data Platform
  • Bachelor’s Degree in IT or Business Related field or equivalent work experience
  • 15 years of IT and business/industry work experience and at least 5+ years influencing senior level management and key stakeholders
  • Experience with Hadoop and YARN based architectures
  • Experience with infrastructure automation, infrastructure as code, automated application deployment, monitoring/telemetry, logging, reporting/dashboarding
  • Experience in building high-performance infrastructures that are scalable and resilient
  • Experience with container technologies, e.g. Docker, etc
  • Experience with test-driven development frameworks for application and infrastructure code
  • Ability to articulate complex architectures in a concise way
  • Ability to create clear and detailed technical diagrams and documentation
  • Experience with cloud-based infrastructure as a service platforms, e.g. AWS, Google Compute Engine, Azure or OpenStack
  • Experience with programming and scripting languages, primarily Javascript, .Net and Ruby
  • Experience with configuration management and automation tools such as: Chef, Puppet, Salt and Ansible
  • Experience with development using Github, TSVS and TFS
  • Experience with Windows and Linux systems administration
  • Experience with the Agile mindset
  • Ability to travel up to 20% including international
  • Bachelor’s or Master’s Degree in technical or business discipline
  • Experience with business case development
  • Knowledge of all components of enterprise architecture
10

Data Platform Engineer Resume Examples & Samples

  • Engineering and integrating Hadoop modules such as YARN & MapReduce, and related Apache projects such as Hive, Hbase, Pig
  • Develop fast prototype solutions by integrating various open source components
  • Building efficient data platform for structured and unstructured data
  • Developing and deploying distributed computing Big Data applications using Open Source frameworks like Apache Spark, Apex, Flink, Storm and Kafka
  • Utilizing programming languages like Java, Spark, Python with an emphasis in tuning, optimization and best practices for application developers
  • Developing data solutions on cloud deployments such as AWS
  • 1+ years of Linux/Unix experience, including basic commands, shell scripting and solution engineering
  • 1+ years of Hadoop administration experience
  • Experience with various tools and frameworks that enable capabilities within the data ecosystem (Hadoop, Kafka, NIFI, Hive, YARN, HBase, NoSQL)
  • Experience with solving problems and delivering high quality results in a fast paced environment
  • Experience managing NoSQL databases such as HBase, Cassandra and MongoDB
  • Expertise in coding in Python, Hive, R, Spark and Java, with an emphasis on tuning/optimization (see the PySpark sketch below)
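To ground the Spark-related bullets above, here is a minimal, hypothetical PySpark batch job; the application name, paths, column names and tuning value are illustrative assumptions, not details from any listing.

```python
# A hypothetical PySpark job sketch illustrating a tuned batch application;
# input/output paths and column names are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-usage-aggregation")
    # Illustrative tuning knob; real values depend on cluster sizing.
    .config("spark.sql.shuffle.partitions", "200")
    .getOrCreate()
)

# Read structured data (e.g. Parquet landed by an ingestion pipeline).
events = spark.read.parquet("/data/landing/events")

# Aggregate per user per day and write back partitioned by date.
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

daily.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/daily_usage")

spark.stop()
```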
11

Master Data Platform Engineer Resume Examples & Samples

  • Create and enhance analytic platforms & tools that enable state of the art, next generation Big Data capabilities to analytic users and applications
  • Engineering and integrating Hadoop modules such as YARN & MapReduce, and related Apache projects such as Hive, HBase, Pig
  • Code and integrate open source solutions into the data analytic ecosystem
  • Operationalization of open source data analytic tools for enterprise use
  • Be part of teams delivering all data projects including migration to new data technologies for unstructured, streaming and high volume data
  • Provide solutions for data engineers such as NIFI processors and workflows
  • Developing data management and governance tools on an open source framework
  • Hands on experience leading delivery through Agile methodologies
  • Promote a risk-aware culture; ensure efficient and effective risk and compliance management practices by adhering to required standards and processes
  • 8+ years’ work experience
  • 3+ years Linux/Unix including basic commands, shell scripting and solution engineering
  • 2+ years Hadoop administration
  • 2+ years’ experience with various tools and frameworks that enable capabilities within the data ecosystem (Hadoop, Kafka, NIFI, Hive, YARN, HBase, NoSQL)
  • Experience developing software solutions to build out capabilities on a Big Data and other Enterprise Data Platforms
  • Experience with NiFi, specifically developing custom processors and workflows
12

Lead Data Platform Engineer Resume Examples & Samples

  • Hadoop administration including building, supporting, performance tuning and monitoring clusters in an enterprise production environment
  • Manage NoSQL databases like HBase, Cassandra and MongoDB
  • Operationalization of open source data analytic tools such as Spark, Python and R for enterprise use
  • Experience developing data solutions on AWS
13

Data Platform Engineer Resume Examples & Samples

  • Develop high-quality, high-performance distributed systems in Java, Python, C/C++
  • Develop upon and integrate with other services within the Facebook application and development stacks
  • Work closely with other teams to understand and mitigate issues and improve performance
  • Work closely with Facebook operations teams to help develop and optimize solutions
  • B.S. degree in Computer Science or a related technical field
  • 2+ years experience programming with Java, C/C++, Python or similar programming language
  • 2+ years experience with distributed databases or distributed systems
  • 2+ years experience with Unix/Linux systems
14

Big Data Platform Engineer Resume Examples & Samples

  • Provides advice and guidance to Functional Leads, Senior developers, and other IS functions
  • Partner with Solution Architects and consult with external vendors to define and enforce data warehouse and BI standards
  • Create and review technical design documentation to ensure the accurate development of solutions
  • Define data dependencies between source systems and target outcomes (source, data lake, BI reporting etc.)
  • Partners with appropriate business analysts, project sponsors, and EDAA leadership to translate business requirements into systems solutions
  • Enforce standards connected to the maintaining of the Big Data Platform and ecosystem tools and services, and review compliance to the agreed standards across the team
  • Guide and mentor the wider team members through technical solution deliverables
  • Demonstrate advanced knowledge of company source data and ability to self-educate when new business functionality presents itself
  • Provide technical thought leadership, compare and contrast different technologies to meet business requirements and cost control drivers
  • Work with Business and IT groups to design and deliver a data integration and workflow framework that provides for the loading of data, data integration, quality management, aggregation, joins etc. and distribution to various systems using advanced Big data technologies
  • Impact assessment of changes to the platform due to key projects
  • Own and be accountable for producing quality deliverables and technical assurance/governance of the Big data solutions and services to ensure architecture scalability, extendibility, performance and bug free code
  • Work with key members of various IS teams to identify and resolve application issues, system enhancements, and design for new user requirements
  • Act as technical advisor to the Development Manager/Technical Leader to ensure development and automated testing of solutions as per the architecture and design
  • Produce & maintain the overall solution design for the entire Big Data Platform
  • Drive the execution of data strategy, design and architecture of solutions on the platform
  • Proactively seek and assess new use cases, evaluate and recommend 3rd party products (e.g. archiving, encryption, NoSQL databases) which helps fill gaps within the service offering and map to the use cases
  • Enforce technical best practices for Big Data management and solutions, from software selection through to technical architectures and implementation processes
  • Document and publish best practices, guidelines, and training information
  • Work with Business consumers and BI development teams to enable data services on the Big Data Platform for integrating partner technologies like Teradata, Oracle, SQL server, Power BI, SAS, and R etc
  • Own the management of technical environments, version control tools and processes, automation of build/deployment processes, release management, quality control of solutions
  • Go-to person for technical guidance, help and expertise to consumer questions on technologies and services on the platform like Hive, Pig, Power BI, SAS, R etc
  • Ensures all functional solutions and components of the Big data service offerings, as a whole are designed and implemented in such a way that they always meet SLAs
  • Contributes to the continuous improvement of the support & delivery functions by maintaining awareness of technology developments and making appropriate recommendations to enhance application services
  • Bachelor’s degree in Computer Science, Software or Electronic Engineering or other related discipline
  • Strong and demonstrable people management experience managing cross functional teams with proven experience of creating an environment that supports and inspires people to develop and deliver
  • Very articulate with excellent written and spoken communication skills negotiating effectively and winning over audiences with compelling and persuasive presentations
  • Capable of articulating complex designs, code and applications for large scale projects to a simple to understand non-technical client-facing standard
  • A track record of making complex business decisions with authority, even in times of ambiguity, considering the potential long term risks and implications
  • Proponent for innovation, best practices, sound design with data & information optimization in mind, strong development habits and efficient team/project structures
  • Passion for technology & understanding how things work
  • Ability to work occasional weekends and varied schedule. (e.g. during go-live)
  • Have a strong development background with deep knowledge of large scale enterprise business systems, MPP, distributed computing systems and business intelligence
  • A very strong SQL background and experience but capable of thinking in terms of networks rather than tables
  • Demonstrable evidence of in-depth functional and technical knowledge in
  • Hadoop (Hortonworks), YARN, MapReduce, Tez, Spark
  • Databases (DB2, Oracle, Oracle RAC, MS SQL, MySQL etc.)
  • CPU and I/O concepts, Distributed File Systems, cluster and parallel compute architectures, In Memory, Distributed Data Grid, cloud computing
  • Enterprise Application and information integration
  • Data Dictionaries, Data Management tools and methodologies
  • In-depth knowledge or experience of
  • MPP appliances (Teradata, Exadata, PDW, etc)
  • Inspiring, motivating and positive attitude; does not hesitate to take up any challenge
  • Has a positive mindset and demonstrates a can-do attitude rather than presenting known issues or other blockers as reasons for missing delivery and quality targets
15

Senior Data Platform Engineer Resume Examples & Samples

  • 75% - Data Engineering
  • Support the design, building, testing and maintenance of Cargill’s Big Data and Advanced Analytics platform
  • Design, build and test APIs that enable simple use of complex datasets
  • Data modeling that is performant and reusable
  • Develop interactive dashboards, reports, and perform analysis
  • Execute strategies that inform data design and architecture in partnership with enterprise-wide standards
  • 10% - Business Partnering, Relationship Management and Consulting
  • Regularly interfaces with architects, analysts, process designers, and BU/Function subject matter experts to understand and evaluate business requirements
  • Works with business to determine functional requirements and then translate those into platform specific design (including ingestion and storage patterns)
  • 10% - Governance and Project Consulting
  • Demonstrates subject matter proficiency in the design, implementation and deployment of new software version, infrastructure, and processes for a large portfolio of services, spanning a significant subset of the organization
  • Utilizing substantial knowledge of data practices and procedures, conducts quality assurance evaluations on new and existing technology
  • Provides expertise while collaborating with other partner IT teams, internal and external to Cargill
  • 5% - Run Operations
  • Collaborate with AMS and Architecture organizations to maintain awareness on the health of overall Data Platform and monitor scorecard metrics
  • Provide support in the transition of big data solutions from the build organization to the operations organization
  • 10+ years of experience in the Information Technology field
  • 7+ years developing software applications, including analysis, design, coding, testing, deploying and supporting of applications
  • 2+ years of experience with end-to-end software development processes and practices (agile/scrum experience preferred)
  • Proficient in application/software architecture (Definition, Business Process Modeling, etc)
  • Understanding of Big Data technology; current on new ideas and tools
  • Good understanding of the Hadoop ecosystem and low level constructs
  • Broad understanding of object-oriented and functional programming paradigms
  • Experience in *nix environment (e.g., ssh and standard commands)
  • Demonstrated collaboration skills, able to engage in interactive discussions with technical and business-oriented teams
  • Experienced with at least one scripting language (e.g., bash/python)
  • BS degree in Computer Science, Applied Mathematics, Physics, Statistics or area of study related to data sciences and data mining
  • Experience developing complex MapReduce programs against structured and unstructured data
  • Experience with loading data to Hive and writing software accessing Hive data (see the sketch after this list)
  • Experience loading external data to Hadoop environments using tools like MapReduce, Sqoop, and Flume
  • Experience using scripting languages like Pig to manipulate data
  • Experience interfacing with data-science products and creating tools for easier deployment of data-science tools
  • Experience in extending open-source Hadoop components
  • Experience with Scala and/or Spark
  • Experience with front-end BI tools (Tableau, Power BI, Business Objects)
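As a hedged illustration of the Hive bullets above, here is a short PySpark sketch that loads external data into a Hive table and queries it back; the database, table, file path and column names are hypothetical, and a configured Hive metastore is assumed.

```python
# A rough sketch (not from the resume samples) of loading data into Hive and reading
# it back with PySpark; database, table and path names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-load-example")
    .enableHiveSupport()   # requires a configured Hive metastore
    .getOrCreate()
)

# Load external CSV data and persist it as a Hive table.
raw = spark.read.option("header", "true").csv("/data/external/customers.csv")
raw.write.mode("overwrite").saveAsTable("analytics.customers")

# Access the Hive data with SQL from application code.
top = spark.sql(
    "SELECT country, COUNT(*) AS n FROM analytics.customers GROUP BY country ORDER BY n DESC"
)
top.show()

spark.stop()
```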
16

Data Platform Engineer Resume Examples & Samples

  • Confidence in the ability to write highly concurrent programs in Java
  • A demonstrated ability to learn new technologies quickly
  • Deep passion for building great software
17

Senior Big Data Platform Engineer Resume Examples & Samples

  • You will own the design and implementation of the software systems that power critical data-flows across the enterprise
  • You will build and maintain the data infrastructure, tooling, and upkeep of the platform
  • You will perform Administration, Performance Tuning & Capacity Planning on Large Scale Clusters (Hadoop/Kafka)
  • You will be involved in prototyping, architecting, and implementing/updating the data infrastructure platform
  • You are involved in usage/performance forecasting/modeling and monitoring
  • You will conduct peer reviews and code walk-throughs for infrastructure code
  • You are a self-starter and can work independently on technical projects, but also work collaboratively with project team members through an agile development process
  • Programming skills; You are comfortable writing code in multiple languages, confident in choosing the right strongly or dynamically typed language for the job. Preferred language familiarity: Java (primary) with Python, Ruby, Scala, or Go (secondary)
  • Database skills; You understand use cases for relational and non-relational data, you’ve implemented code against several different database platforms. Bonus: maybe you’ve even been a DBA in a past life
  • Systems administration; You’re a hardened Linux systems administrator. You have opinions on what a production ready system is. Managing and diagnosing issues on mission critical systems comes second nature. Excellent shell scripting skills are expected
  • Strong familiarity with scalable web service architecture patterns and frameworks
  • Automation; Managing infrastructure as code is the only solution in your book. You have hands on experience writing code for some of the major configuration management systems (Puppet/Chef/Ansible/etc.)
  • Experience with Hadoop, Elasticsearch, Cassandra, Kafka, Mesos, Spark, Splunk, Zookeeper, Puppet, Docker is preferred
  • Understanding of code promotion, CI/CD best practices, Platform as a Service Architecture, and Distributed systems orchestration
  • Understanding of schedulers, workload management, availability, scalability, load balancing
  • Application clustering / load balancing concepts and technologies
  • Experience working in an agile team environment
  • Committed to Open Source Projects. Please provide Github links if appropriate
  • Conduct code walk-throughs, peer reviews, and produce technical documentation
18

Big Data Platform Engineer Resume Examples & Samples

  • Ownership and accountability of the management and administration of the Big Data Platform
  • Manage and own operating system upgrades and execute information and platform security on the Big Data platform
  • Plan, Manage and perform HDP upgrades, connected ecosystem product upgrade as per roadmap; ensure alignment between British Gas’s requirements and the new versions made available in open source technology partner communities
  • Ownership and accountability of designing and implementing proactive capacity management, with responsibility for creating and managing estimation, recharge and prediction models, ensuring the provision of new Data nodes to expand the Big data infrastructure proactively
  • Ownership of enterprise grade build, configuration, administration and management of the Development and Production instances of the Big data platforms and technologies implemented in British Gas
  • Manage and administer British Gas Big Data service & infrastructure offerings acting as the liaison between the IT Enterprise, Solution architecture, development communities, Operations and the IT infrastructure groups across British Gas business units
  • Ensures all components of the Big Data service as a whole (and Hadoop with all ecosystem components in particular) are running normally and to SLAs
  • Troubleshoot cluster & ecosystem product issues as a priority with expertise & lead through to resolution
  • Produce and manage advanced workload characterization, benchmarks and metrics
  • Become the first-level go-to person for technical guidance and expert support for consumer questions on operating services on the platform
  • Background in IT infrastructure, large scale enterprise business systems & distributed systems preferable
  • Hands on experience working within a Linux/Unix environment
  • Advantageous to have experience with running components in Hadoop ecosystem (Hive, Pig, Ambari, Oozie, Sqoop, Zookeeper, Mahout, Hadoop HDFS, YARN and MapReduce)
  • Experience in building and managing NoSQL Database like HBase or Cassandra
  • Knowledge of security models for Big data platforms (Kerberos, Knox etc)
  • Distributed File Systems, cluster and parallel computer architectures
  • Hadoop distributions (preferably Hortonworks), MapReduce & YARN
  • Databases (Oracle, Oracle RAC, MS SQL, MySQL etc)
  • In Memory, Distributed Data Grid, cloud computing
  • High availability and contingency solutions, workload management
  • Multiple technologies including Java, C/C++, Unix Platforms and large scale implementations of programming theories/concepts with proven results
19

Senior Data Platform Engineer Resume Examples & Samples

  • Experience working in an agile environment with rapid iterations of small amounts of functionality delivered frequently
  • Extensive experience in building, ideally distributed, services in Scala or Python
  • Experience modelling and querying data in a Lucene-based search solution (ideally Elasticsearch)
  • Experience building data ingest pipelines using Apache Kafka
  • Extensive experience with at least one NoSQL/NewSQL database (Kudu, Cassandra, Redis, Riak, Couchbase etc.) at reasonably large scales
  • A B.S. degree in Computer Science or an equivalent field
  • Experience with Akka, ideally in Scala
  • Writing, debugging and performance tuning Spark applications. Experience with Spark Streaming, Spark SQL and DataFrames using Parquet as an underlying store (see the streaming sketch after this list)
  • Experience in building statistical models using R
  • Implementing machine learning models (in any framework). Implementing them on Spark MLLib
  • Working experience with Jenkins including the Pipeline plugin
  • Experience using at least one of Ansible, Chef, Salt or Puppet
  • Working experience with Docker containers
  • Working experience with Zookeeper is a plus
  • Experience with Zoomdata, Tableau or other visualization tools is a strong plus
  • An M.S. or PhD in Computer Science or an equivalent field
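Below is a minimal sketch of the Spark Streaming + Parquet combination mentioned above, written as a PySpark Structured Streaming job; the broker address, topic, paths and checkpoint location are hypothetical, and the spark-sql-kafka connector is assumed to be available on the classpath.

```python
# A minimal sketch (assumed setup, hypothetical topic/paths) of a Spark Structured
# Streaming job that ingests from Kafka and persists Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-to-parquet").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (requires the spark-sql-kafka package).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka values arrive as bytes; cast to string and keep the event timestamp.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

# Continuously append Parquet files; the checkpoint directory tracks progress.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/streams/clickstream")
    .option("checkpointLocation", "/checkpoints/clickstream")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```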
20

Data Platform Engineer Resume Examples & Samples

  • Java, Web, Spring (MVC/DAO/ORM/web services/Security/AOP), JAXB, JPA, log4j, Hibernate, AOP, Maven, Struts 2, AJAX, CSS, JavaScript (especially jQuery), SOAP, REST (especially Jersey)
  • Good judgement around design decisions. Must know when to hardcode values and when to read them from property files or database configuration. Must know when to use a class function, a separate service, a stand-alone POJO, etc
  • Expertise in UNIX system concepts, especially process management and security
  • Excellent awareness of the benefits and issues of distributing automation across the network, especially with HTTPS
  • Comfortable with internal database concepts and the ability to write and tune advanced SQL
  • Very strong technical communication, written and spoken
  • Entrepreneurial innovation attractive. Can you see where inventive effort would save the business cost or risk?
  • 5-10 years of experience in Java and web development
  • Track record of implementing SDLC with regimented release procedures for business critical systems
21

Big Data Platform Engineer Resume Examples & Samples

  • 2 years of experience building and managing complex products/solutions
  • Proven track record of administering/engineering distributed solutions dealing with very high volumes of data
  • Understanding of virtual machine technologies, physical machines, networking and storage systems
  • Demonstrates Awareness of multiple infrastructure disciplines
  • 2 + years of experience with distributed, highly-scalable, multi-node environments
  • Familiarity with Big Data technologies (Solr, Hive, HBase, Spark, Kafka, YARN, Storm, Splunk); understands the concepts and technology ecosystem around both real-time and batch processing in Hadoop
  • Familiarity with DevOps tools (Puppet, Chef, Python)
  • Awareness of web technologies and protocols (JAVA/NoSQL/JSON/REST/JMS)
  • Excellent written/oral communication skills - You speak and write English fluently
22

Senior Data Platform Engineer Resume Examples & Samples

  • Engineering and integrating Hadoop modules such as YARN & MapReduce, and related open source projects such as Hive, HBase, Pig into the data analytic ecosystem
  • Industry experience in Financial Services or other regulated industries
  • 3+ years’ work experience
  • 2+ years Linux/Unix including basic commands, shell scripting and solution engineering
  • 0 – 1 years Hadoop administration
  • 0 – 1 years’ experience with various tools and frameworks that enable capabilities within the data ecosystem (Hadoop, Kafka, NIFI, Hive, YARN, HBase, NoSQL)
  • Experience working with automated build and continuous integration systems (Chef, Jenkins, Docker)
23

Big Data Platform Engineer Resume Examples & Samples

  • Leverage deep middleware and big data technology expertise to deliver new products and solutions and ensure that all areas in the organization that provision, manage and support the middleware technologies have the tools, processes and documentation they require to effectively execute their roles
  • Implement/design Big Data Services on Public Cloud and leverage various managed services (Azure, AWS…)
  • Execute strategy as it relates to the introduction of new products, tools and the automation of build, test, release and configure activities across Application, Platform and Infrastructure
  • 5+ years of in-depth experience with middleware and big data technologies including Hadoop, Spark, Kafka, Mesos/YARN, JBoss, HBase, HDFS, etc., along with their upstream/downstream interaction points across multiple platform domains and technology areas
  • Experience in Data platforms such as: NoSql, distributed/in-memory caching (MongoDB, Cassandra…)
  • Experience in cloud environments and engineering middleware and big data platforms for cloud environments (Kafka)
  • Strong understanding of operating platform stacks including Redhat Linux, Windows, OpenStack
  • Experience in developing strategies, roadmaps and designs for large-scale organizations. Enabling the enterprise the ability for rapid growth and scalable solutions that do not require extensive manual intervention
  • In-depth knowledge of scripting tools and configuration management software (Python, PowerShell, Perl, SaltStack, etc.) to enable extensive automation of our products and technologies for provisioning and management of the systems
  • Experience in developing and enabling PaaS/IaaS capabilities in the context of big data or middleware products and services
  • Utilize subject matter expertise to introduce new disruptive technologies, or new versions of existing technologies, into the organization
  • Ability to review, analyze and develop the tools and processes required by the organization to efficiently manage, support and provision engineered platforms
  • Design highly available platforms that meet strict Security and compliance policies required by our bank systems
  • Proven ability to collaborate across a large organization to effectively realize outcomes
  • Proven ability to leverage deep subject matter expertise to develop and deploy a vision and align others to that vision. Known for providing creative thought leadership while also listening and engaging others to provide input in the shaping of that vision
  • Adapts to new, different or changing requirements, quickly grasps new concepts, adapts and reflects on lessons learned – comfortable with ambiguity, analyzes and evaluates, defines problem/challenge, identifies alternatives and makes timely decisions
  • Exhibits comfort and maintains composure with audiences at all levels, tailors communication style and delivery to different audiences, uses effective listening skills to gain clarification from others
  • Uses knowledge of formal and informal policies and organizational interconnections to effectively navigate boundaries
  • Strong influence and persuasion skills – gets people to change position or course of action through conversations and debate
  • Work collaboratively across the ITS leadership team to gain support, resourcing, and drive results
24

Big Data Platform Engineer Resume Examples & Samples

  • Manage and own the system upgrade and platform security on the Big data platform
  • Plan, manage and perform platform upgrades and connected ecosystem product upgrades as per the roadmap; ensure alignment with the new versions made available in open source and technology partner communities
  • Ownership and accountability of designing and implementing proactive capacity management
  • Ownership of build, configuration, administration and management of Big data platforms and technologies
  • Troubleshoot cluster & ecosystem product issues as a priority and lead them through to resolution, working to SLAs as defined within the operational environment
  • Troubleshoot network and infrastructure issues with expertise and lead them through to resolution, working to SLAs as defined within the operational environment
  • Define and build lightweight (low-overhead), real-time monitoring of system characteristics, plus tools for correlating and analysing those statistics, to ensure good health of the Big Data infrastructure
  • Instil a service mindset and apply a strong focus on business continuity and troubleshooting to ensure good service
  • First-level go-to person for technical guidance and expert support for consumer questions on operating services on the platform
  • Serve as second level support to Big data operations for infrastructure and component services
  • Work with engineering support on upgrades of the cluster participating in the installation, configuration and administration of multi-node Big data platforms
  • Implement new services to agreed SLAs and ensure transition to support, or standardise services to Global Services
  • Experience Required
25

Data Platform Engineer Resume Examples & Samples

  • MS or PhD in subjects ranging from computer science, electrical engineering, mathematics, operations research, astrophysics, to cognitive neuroscience
  • Strong skills in statistics, machine learning, and/or deep learning (see the sketch after this list)
  • Exceptional coding and design skills in Python and Java/Scala (additionally, C++ & R are a plus)
  • Experience with different flavors of SQL & modeling relational tables
  • Love to work autonomously and take ownership of projects
  • Are naturally curious and get excited to dig in and understand how things work
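A tiny, self-contained Python sketch of the statistics/machine-learning skills mentioned above; the data is synthetic and scikit-learn is an assumed dependency.

```python
# Synthetic binary classification with scikit-learn; purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Two informative features plus noise drive the synthetic label.
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```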
26

Senior Associate Data Platform Engineer Resume Examples & Samples

  • Accountable for designing and delivering against New York Life’s data technology strategy
  • Work with a team of engineers and developers to deliver against the overall technology data strategy
  • Collaborate with peers across Enterprise Data Management, to deliver on the overarching strategy
  • 5+ years in a variety of technologies – especially Linux, Web, Databases and Big Data (Hadoop)
  • Knowledge of Hadoop and NoSQL DBs (e.g. MongoDB, MarkLogic, etc.) and insights on when to recommend a particular solution
  • Understanding of Unix/Linux (building/assembling packages, shell scripts, configuration management and OS tuning)
  • Hands on experience with Hadoop technologies (YARN, Hive, MR, Tez, Spark, etc.)
  • Understanding of Networking (tracing, packet capture, etc.)
27

Senior Data Platform Engineer Resume Examples & Samples

  • Drive knowledge management practices for key enterprise data platforms and collaborate on solution design and delivery
  • Develop framework, metrics and reporting to ensure progress can be measured, evaluated and continually improved
  • Deep expertise in data-related tools, including the latest data solutions (e.g. Big Data, Cloud, In-Memory Analytics, etc.)
  • Hands-on experience with Hadoop and NoSQL DBs (e.g. MongoDB, MarkLogic, etc.) and insights on when to recommend a particular solution
  • Strong understanding of Hadoop technologies (YARN, Hive, MR, Tez, Spark, etc.)
  • Experience with enabling Kerberos and best practices for securing data is a plus
  • Experience working with Vendors/Open Source in the Hadoop ecosystem
  • Experience with version control and continuous integration (Git, Bamboo, Jenkins)
28

Associate Data Platform Engineer Resume Examples & Samples

  • Ensure enterprise data platforms are standardized, optimized, available, reliable, consistent, accessible and secure to support business and technology needs
  • Stay current and informed on emerging technologies and new techniques to refine and improve overall delivery
  • 3+ years in a variety of technologies – especially Linux, Web, Databases and Big Data (Hadoop)
  • Familiarity with data-related tools, including the latest data solutions (e.g. Big Data, Cloud, In-Memory Analytics, etc.)
  • Experience in standing up enterprise systems
  • Familiar with Unix/Linux (building/assembling packages, shell scripts, configuration management and OS tuning)
  • Understanding of Hadoop technologies (YARN, Hive, MR, Tez, Spark, etc.)
  • Knowledge of the open source community (opening issues, tracking issues and identifying problematic issues ahead of time by tracking open JIRA issues in the community)
29

Lead Data Platform Engineer Resume Examples & Samples

  • 8+ years core Java experience: building business logic layers and back-end systems for high-volume data pipeline backend applications
  • Current experience developing microservices
  • Current experience using Java development, SQL Database systems, and Apache products (Tomcat, Spark, Hadoop, Cassandra)
  • Current experience using Scala development
  • Current experience with high-speed messaging frameworks and streaming (Kafka, Akka, reactive); see the sketch after this list
  • Current experience developing and deploying applications to public cloud (AWS)
  • Experience with DevOps tools (GitHub, Jira) and methodologies (Agile, Scrum, Kanban, Test Driven Development)
  • Refactor early and often to continuously improve code quality
  • Excellent analytical/troubleshooting skills
  • BS in Computer Science, Engineering, or related technical discipline or equivalent combination of training and experience
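To illustrate the Kafka experience listed above, here is a small, hypothetical producer/consumer sketch using the kafka-python client; the broker address, topic name and message payloads are made up.

```python
# Hypothetical kafka-python producer/consumer round trip; broker and topic are illustrative.
import json

from kafka import KafkaConsumer, KafkaProducer

BROKERS = "broker1:9092"
TOPIC = "orders"

# Produce a few JSON messages.
producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
for i in range(3):
    producer.send(TOPIC, {"order_id": i, "status": "created"})
producer.flush()

# Consume them back from the beginning of the topic.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKERS,
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=5000,   # stop iterating if no message arrives for 5s
)
for message in consumer:
    print(message.value)
```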
30

AD, Big Data Platform Engineer Resume Examples & Samples

  • Perform configuration, patching, upgrades of the Cloudera environments and associated tools
  • Own and resolve any opportunities or issues related to the operations of the platform across multiple tenants
  • Create detailed designs and POCs to enable new workloads and technical capabilities on the Platform. Work with the platform and infrastructure engineers to implement these capabilities in production
  • Create full visibility into the health and utilization of the platform through use of real-time dashboards, alerts and other mechanisms
  • Manage workloads and enable workload optimization including managing resource allocation and scheduling across multiple tenants to fulfill SLAs
  • Provide Level-3 support and coordinate support from relevant vendors
  • Coordinate activities with Cloud Infrastructure and other enabling IT teams
  • Ensure platform SLAs are met
  • Report KPIs and metrics
  • Participate in planning activities
  • Participate in Data Science CoP and perform activities to increase platform skills
  • Minimum of 3-5 years working in Big Data service delivery (or equivalent) roles focusing on the following disciplines
  • Platform/IT engineering and/or deployment
  • Solution architecture
  • Big Data Analytics / Business Intelligence / Data Warehousing
  • Minimum of 5-10 years of combined hands-on experience with the following techniques and tools
  • Big Data (Hadoop ecosystems/distributions e.g. Cloudera/HortonWorks/EMR)
  • Data integration (ETL/ELT)
  • Solid security expertise, in particular with Kerberos and Active Directory
  • Hands on experience with managing solutions deployed in the Cloud, preferably on AWS
  • Cloudera Engineer Certification is a plus
  • Experience working in a Global company and an on-shore/off-shore operating model
  • Experience working in a DevOps model is a plus
  • Understanding of IT Processes and Frameworks such as ITIL and SDLC
  • Capability for being thoughtful, extroverted and collaborative