Careers that Change Lives
We are looking for a talented Senior Level DataOps Engineer – Principal Engineer for the R&D Engineering function at the Medtronic Engineering and Innovation Center, supporting our Cardiac Rhythm and Heart Failure (CRHF) Software division. The right person will be our go-to engineer for implementing our DataOps infrastructure and will continuously look for opportunities to increase our productivity and the quality of our software releases.
A Day in the Life
KEY RESPONSIBILITIES:

  • Architect, build, and test an optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud-based ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Understand and implement practices to comply with PHI, GDPR and other emerging data privacy initiatives.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Learn and understand software standards for medical devices, e.g., IEC 62304.
  • Participate in process improvement initiatives for the software team. This includes recognizing areas for improvement as well as working with others to develop and document process improvements.
  • Provide hands-on leadership, coaching, mentoring, and software engineering best practices to junior software engineers.
  • Work under general direction and collaboratively with internal and external partners.

Education:

  • Required: Bachelor of Engineering or Bachelor of Technology in Computer Science or a related field.
  • Preferred: Master of Engineering or Master of Technology in Computer Science or Biotechnology from a premier institute.

Required Skills/Competencies:

  • 12+ years of Software Engineering experience
  • 5+ years of experience in a Data Engineer role
  • Advanced working knowledge of SQL and experience with relational databases and query authoring, as well as working familiarity with a variety of databases (SQL Server, Oracle, Apache Hive)
  • Solid experience building and optimizing ‘big data’ pipelines, architectures, and data sets.
  • Deep knowledge and experience with JSON and XML schemas and documents.
  • Strong analytic skills related to working with unstructured datasets.
  • Strong experience with knowledge and metadata management principles and methods.
  • Working knowledge of REST and implementation patterns pertaining to Analytics.
  • Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores (Kafka, Kinesis).
  • Experience supporting and working with cross-functional teams in a dynamic environment.

Preferred Skills/Competencies:

  • Experience with big data tools: Hadoop, Hive, HBase, Spark, Kafka
  • Experience with log management tools (Elastic, Kibana, Logstash, Beats, Fluent)
  • Experience with relational SQL and NoSQL databases, including SQL Server, MySQL, Postgres, and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, Apache NiFi, StreamSets, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with Azure cloud services
  • Experience with object-oriented/functional and scripting languages: Python, Java, C++, C#, Scala, Bash, etc.
  • Experience with Windows and Linux operating systems
  • Experience with DevOps processes, including source code management (Git), automated build (Maven) and deployment
  • Experience with DevOps tooling in production: Docker, Kubernetes, Drone, and Jenkins
