Our client is a large global company currently sitting high in the Fortune 500. They have made a significant investment in Dublin, setting up a new Innovation Lab, and are currently seeking an experienced Senior Java Developer with some lead experience who is comfortable working with enterprise Big Data solutions. This is a great opportunity to join what will be one of the most significant R&D labs set up in Dublin in recent years.
Responsibilities:
• Providing guidance to the Scrum team as a Scrum Master
• Providing hands-on leadership for the design and development of ETL data flows using Hadoop and Spark ecosystem components
• Leading the development of large-scale, high-speed, low-latency data solutions in the areas of large-scale data manipulation, long-term data storage, data warehousing, low-latency retrieval systems, real-time reporting and analytics, and data applications (visualization, BI, dashboards, ad-hoc analytics)
• Design and implement workflow jobs using Talend and Unix/Linux scripting to perform ETL on the Hadoop platform
• Translate complex functional and technical requirements into detailed designs
• Perform analysis of data sets and uncover insights
• Maintain security and data privacy
• Propose best practices/standards
• Work with the Scrum Master, System Analyst and development manager on a day-to-day basis
• Work with service delivery (support) team on transition and stabilization
• Design and develop optimal solutions for the implementation, maintenance and monitoring of all
component software
• Responsible for integrating all development with the architecture used across the company
• Help shape and augment the team's continuous delivery capabilities and utilize DevOps best practices to improve delivery automation, collaboration, and efficiency
Qualifications:
• A Master's degree in Engineering or Information Systems, a technical certification in programming or data processing, or equivalent relevant experience
Essential Qualifications:
• 5+ years of hands-on developer experience with distributed systems (e.g. Java/Python) or ETL tools
• 7+ years of hands-on experience designing and developing enterprise application solutions for distributed systems
• 2+ years of expert-level skills and experience with Hive, Pig, MapReduce, and the Hadoop ecosystem framework
• Experience leading Scrum teams and working on Scrum tools like Rally
• Experience with HBase, Talend, and NoSQL databases; experience with Apache Spark or other streaming big data processing preferred
• Understanding of database internals and data warehouse implementation techniques; working knowledge of SQL
• Solid understanding of data structures, common algorithms, and their time complexity
• Able to work well in extremely fast-paced, entrepreneurial environments; an independent self-starter who is highly focused and results-oriented
• Strong analytical and problem-solving skills
• Expert-level skills and experience in Java, or equivalent distributed programming experience
• Programming and language skills:
o Expert-level skills and experience with the Hadoop ecosystem
o Strong in Java, Pig, and SQL
o Knowledge of Python and Scala
o Strong knowledge of loading from disparate data sets
o Pre-processing using Hive, Pig, Python, or Unix scripting
o Good skills and experience in Linux, Java, XML, JSON, and REST
o Experience with UNIX and shell scripting
Desired Qualifications:
• PMP and Scrum Master Certification
• Previous experience in the healthcare industry
• System design and implementation experience
• Familiarity with fault-tolerant system design and high-performance engineering
• Understanding of fundamental concepts of scheduling, synchronization, IPC, and memory management
• Familiarity with information retrieval techniques is preferred
To apply for the Java Applications Team Lead-Big Data role please contact Hugh McCarthy on 01 662 1000.