Job Description
- DevOps and AWS experience is a must.
- AWS Big Data: Experience setting up an integrated AWS big data platform using Kafka on AWS, Kerberos, Apache Airflow, Glue Catalog, Athena, and Redshift (see the Glue/Athena sketch after this list).
- AWS: Working experience and a good understanding of the AWS environment, including VPC, EC2, EBS, S3, RDS, SQS, CloudFormation, Lambda, Glue, Athena, Redshift, etc.
- Previous experience working with big data platforms such as Cloudera or Hortonworks is a plus.
- Have worked on setting up a data catalog (e.g., the AWS Glue Data Catalog).
- Experience managing infrastructure as code (CloudFormation or Terraform); see the CloudFormation sketch after this list.
- Programming: Experience programming with Python, Bash, REST APIs, and JSON encoding.
- Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment.
- Monitoring & Alerting: Hands on experience with monitoring tools such as AWS CloudWatch. Able to schedule alerts for unauthorised access , Job failure etc
- Backup/Recovery: Experience with the design and implementation of big data backup/recovery solutions.
- Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load balancers (ELB, HAProxy) and high availability architecture.
- Ability to keep systems running at peak performance and to apply operating system upgrades, patches, and version upgrades as required.
- Implementation of Auto Scaling for instances behind an ELB using ELB health checks, or on EMR/Kubernetes (see the Auto Scaling sketch after this list).
- IAM and policy management to restrict users to particular AWS services (see the IAM policy sketch after this list).
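
A minimal sketch of the Glue Catalog / Athena work described above, using boto3; the analytics_db database, events table, region, and results bucket are placeholder assumptions, not names from this posting:

```python
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")
athena = boto3.client("athena", region_name="us-east-1")

# Register a database in the Glue Data Catalog (placeholder name).
glue.create_database(DatabaseInput={
    "Name": "analytics_db",
    "Description": "Curated datasets for ad-hoc Athena queries",
})

# Run an ad-hoc Athena query against that catalog database.
run = athena.start_query_execution(
    QueryString="SELECT count(*) FROM analytics_db.events WHERE event_date = date '2024-01-01'",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # placeholder bucket
)

# Poll until the query finishes, then report its final state.
while True:
    status = athena.get_query_execution(QueryExecutionId=run["QueryExecutionId"])
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        print("Athena query finished with state:", state)
        break
    time.sleep(2)
```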
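For the infrastructure-as-code item, a hedged sketch: keep resources in a CloudFormation template and deploy it with boto3. The stack name and the single versioned S3 bucket are placeholders; a real platform template (or a Terraform configuration) would declare far more resources:

```python
import boto3

# Minimal CloudFormation template (placeholder logical resource name).
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  RawDataBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
"""

cfn = boto3.client("cloudformation", region_name="us-east-1")

# Create the stack and block until CloudFormation reports completion.
cfn.create_stack(StackName="example-data-platform", TemplateBody=TEMPLATE)
cfn.get_waiter("stack_create_complete").wait(StackName="example-data-platform")
print("Stack created")
```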
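For the monitoring and alerting item, a sketch of the job-failure half using a CloudWatch alarm; it assumes the pipeline publishes a custom FailedJobs metric in a DataPipeline namespace, and the SNS topic ARN is a placeholder (an unauthorised-access alert would typically hang off CloudTrail-backed metrics instead):

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Alarm whenever the (assumed) custom FailedJobs metric is non-zero in a 5-minute window.
cloudwatch.put_metric_alarm(
    AlarmName="data-pipeline-job-failure",
    Namespace="DataPipeline",          # assumed custom namespace published by the jobs
    MetricName="FailedJobs",           # assumed custom metric
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder topic
    AlarmDescription="Notify the on-call channel when any pipeline job fails",
)
```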
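For the Auto Scaling item, a boto3 sketch of a group whose instance health is judged by the load balancer's health checks; the launch template name, target group ARN, and subnet IDs are placeholders:

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Auto Scaling group that uses ELB (target group) health checks instead of plain EC2 status checks.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchTemplate={"LaunchTemplateName": "web-launch-template", "Version": "$Latest"},  # placeholder
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    TargetGroupARNs=["arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/web/abc123"],  # placeholder
    HealthCheckType="ELB",
    HealthCheckGracePeriod=300,
    VPCZoneIdentifier="subnet-aaa111,subnet-bbb222",  # placeholder subnets
)

# Scale on average CPU so the group grows and shrinks automatically.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="target-60-percent-cpu",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {"PredefinedMetricType": "ASGAverageCPUUtilization"},
        "TargetValue": 60.0,
    },
)
```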
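For the IAM item, a sketch that restricts an existing (placeholder) data-analysts group to Athena, read-only Glue Catalog calls, and their Athena results bucket:

```python
import json
import boto3

iam = boto3.client("iam")

# Customer-managed policy allowing only Athena, read-only Glue Catalog, and the results bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["athena:*", "glue:Get*", "glue:BatchGet*"],
         "Resource": "*"},
        {"Effect": "Allow",
         "Action": ["s3:GetObject", "s3:PutObject", "s3:GetBucketLocation", "s3:ListBucket"],
         "Resource": ["arn:aws:s3:::example-athena-results",      # placeholder bucket
                      "arn:aws:s3:::example-athena-results/*"]},
    ],
}

created = iam.create_policy(PolicyName="athena-analyst-access",
                            PolicyDocument=json.dumps(policy_document))

# Attach the policy to an existing group so its members get only these services.
iam.attach_group_policy(GroupName="data-analysts",                 # placeholder group
                        PolicyArn=created["Policy"]["Arn"])
```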
Required Skills
DevOps, Big Data, Kafka, Kerberos, Apache Airflow, Redshift, Programming, Python, Bash, REST, JSON Encoding, Integration, Linux, Hadoop, Monitoring, Recovery, Design and Implementation, Peak Performance, Operating System, Access, Networking, TCP/IP Networking, SMTP, Load Balancers, ELB, HAProxy, High Availability Architecture, Implementation, Auto Scaling, EMR, Management