This role calls for a Hadoop expert who is used to working with Red Hat Linux in a DevOps environment.
It is a well-paid contract role offering excellent scope for IT career progression in a business-casual financial services environment.
It is based in Eastpoint Business Park, Clontarf.
This role will sit within the CDO ‘DataLab’ Science and Innovation division, with a primary focus on supporting the company’s Anti-Financial Crime (AFC) work stream to solve complex, and usually confidential and sensitive, problems by leveraging state-of-the-art data processing tools and analytics techniques.
The successful candidate will manage the day-to-day operations of the company’s environments and supporting platforms, engaging with the wider CDO platform teams to do so. They will apply a “DevOps” philosophy and contribute to all aspects of project execution, from establishing environments through to developing solutions. They will be an expert-level administrator in more than one type of data processing technology, including Secure Hadoop (Cloudera + Kerberos + Encryption), databases (Oracle, MySQL), data integration tools (e.g. Pentaho, Informatica), and scripting and data analytical tools and languages (Python, Spark, Hive, SQL).
The candidate will be responsible for platform and application configuration management, capacity management, environment creation and readiness (software installation, configuration, etc.), data preparation, solution development, performance tuning and solution optimization. Key responsibilities include:
• Configuration management and support of environments
• Creation of new capabilities, tools and solutions (e.g. environments, containers, monitors, scripts)
• Support of project teams (best practices, technical debt remediation, design guidance)
• Creation and preparation of environments for testing (e.g. Clusters, Databases, VMs)
• Evaluation and testing of new data technologies, tools and platforms (e.g. Hadoop, Analytics)
• Engagement with the wider CDO platforms team to ensure alignment and best practices
Required skills and experience:
• Expert in administering Hadoop and Hadoop ecosystem technologies (Cloudera, Hive, Impala, Spark, etc.)
• Deep understanding of UNIX/Linux (Red Hat) systems administration (setup, memory, tuning, scripting, networking)
• Expert in scripting and working in data processing environments (Python, Flume, Kafka, Pentaho, Informatica)
• Experience of using container and virtualization technologies (VMware ESX, Docker, CDH Parcels)
• Experience of using cloud environments (Google Cloud, AWS)
• Experience with SAS is preferable
• Understanding of chain of custody and working in controlled/secure environments
• Configuration management (management and operation)
• Software distribution (Puppet, Chef, MSI)
• Monitoring and enterprise management (Geneos, Nimsoft)
• Security and identity management (LDAP, Active Directory)
• Messaging and communications (TCP/UDP)
• IT lifecycle management (ITIL)
• Educated to degree level (or overseas equivalent)
• Certified OS/platform administrator (e.g. Red Hat, SUSE)
Candidate will be:
• A creative data technologist, passionate about data/information management and architectures
• Innovative, creative, articulate and able to work and collaborate with a broad range of stakeholders in a multi-site, multi-country global organization
• Experienced, with a minimum of 5 years in complex systems administration
• Able to demonstrate examples of previously delivered, successful innovation projects
To apply for the DevOps Tooling Engineer role please contact Derek Smyth on 01 662 1000.