Big Data Development Consultant (Hadoop)

Job Category:
Software Developer/Engineer, Analyst (Business/Systems)
Job Type:
Permanent
Level of IT Experience:
10 Years +
Area:
Central Dublin
Location:
Dublin City Centre
Salary:
€100,000 to €120,000 per annum
Salary Description:
Bonus and Benefits
Posted:
11/10/2016
Recruiter:
Allen Recruitment
Job Ref:
BBBH9780

Senior Big Data Development Consultant (Hadoop)

Dublin City Centre, Ireland

Permanent job with excellent salary plus bonus and generous benefits package

Job Ref: BBBH9780

We are assisting a high-profile international company, which has opened a new Software Development Centre in Dublin City Centre, in sourcing experienced Big Data Development Consultants with expertise in Java and the Hadoop ecosystem.

This is a fantastic opportunity to join a team (in a leadership capacity) where you will work on cutting-edge Big Data technology projects.

Duties: 

  • Provide hands-on leadership for the design and development of ETL data flows using Hadoop and Spark Ecosystem components.
  • Lead the development of large-scale, high-speed, low-latency data solutions covering large-scale data manipulation, long-term data storage, data warehousing, low-latency retrieval systems, real-time reporting and analytics, and data applications (visualisation, BI, dashboards, ad-hoc analytics)
  • Design and implement workflow jobs using Talend and Unix/Linux scripting to perform ETL on the Hadoop platform
  • Translate complex functional and technical requirements into detailed design
  • Perform analysis of data sets and uncover insights
  • Maintain security and data privacy
  • Propose best practices/standards
  • Work with Scrum Master, System Analyst and Development Manager on a day-to-day basis
  • Work with service delivery (support) team on transition and stabilization
  • Design and develop optimal solutions for the implementation, maintenance and monitoring of all components of software
  • Responsible for integrating all development with the architecture used across the company
  • Help shape and augment team’s continuous delivery capabilities and utilize DevOps best practices to improve delivery automation, collaboration, and efficiency 

Educational Qualifications:

  • Suitable applicants will (ideally) have a Master's Degree in Engineering or Information Systems
  • At a minimum they’ll have a Bachelor’s Degree in Software Engineering with relevant experience in data processing

Technical Qualifications / Experience: 

  • 5+ years of hands-on development experience with distributed systems in Java or Python, or with ETL tools (required)
  • 7+ years of hands-on experience designing, leading, and developing enterprise application solutions for distributed systems
  • 2+ years of expert-level experience with Hive, Pig, MapReduce, and the Hadoop ecosystem framework
  • Experience leading Scrum teams and using Scrum tools such as Rally
  • Experience with HBase, Talend, and NoSQL databases; experience with Apache Spark or other streaming big data processing frameworks preferred
  • Understanding of database internals and data warehouse implementation techniques; working knowledge of SQL
  • Solid understanding of data structures and common algorithms, including their time complexity
  • Able to work well in extremely fast-paced, entrepreneurial environments; an independent self-starter who is highly focused and results-oriented
  • Strong analytical and problem solving skills
  • Expert-level skills in Java, or equivalent distributed programming experience

Programming and Language Skills: 

  • Expert-level skills and experience in the Hadoop ecosystem
  • Strong in Java, Pig, and SQL
  • Knowledge of Python and Scala
  • Strong knowledge of loading data from disparate data sets
  • Pre-processing using Hive, Pig, Python, and Unix scripting
  • Good skills and experience in Linux, Java, XML, JSON, REST
  • Experience with UNIX and Shell scripting

Desired Qualifications: 

  • PMP and Scrum Master Certification
  • Experience with HBase, Talend, and NoSQL databases; experience with Apache Spark or other streaming big data processing (preferred)
  • System design and implementation experience
  • Familiarity with fault-tolerant system design and high-performance engineering
  • Understanding of fundamental operating system concepts: scheduling, synchronization, IPC, and memory management
  • Familiarity with information retrieval techniques (preferred)

This is an excellent career opportunity to join a brand new start-up team. 

We are also recruiting Senior Application Developers with Java and Hadoop expertise.

IMPORTANT: There will be NO visa sponsorship available with these roles. Applicants MUST be eligible to work in the Republic of Ireland (i.e. citizenship of an EU country).

Applications MUST include a CV.

Interested in these excellent opportunities? 

Submit your CV today (in Microsoft Word format) to: phil@allenrec.com

Contact Details:
Allen Recruitment
Tel: 016694055
Contact: Phil Finane
