Big Data DevOps Engineer
Want to join a cutting-edge engineering team working on a product line that now requires big data analytical solutions to interpret, collate and analyse the masses of data it’s producing?
• You’ll be designing and building scalable, fault-tolerant, highly available real-time big data solutions
• You’ll be developing new data analytics modules
• You’ll be creating and managing data pipelines
• You must be eligible to live and work in the EU on application
• You must have at least 7 years’ experience working in Linux or UNIX
• Proven expertise in Java, Bash, and Perl or Python
• Proven expertise in designing large, distributed big data architectures
• Exposure to some or all of Scala, VMware and Docker is advantageous
• Deep knowledge of hypervisors, networking, physical machines and storage solutions
• Expertise in as many of the following as possible: HBase, Hive, HDFS, Kafka, Spark, Storm, YARN, Splunk, etc.
• Expertise in interactive, batch and real-time/streaming processing
• Educated to degree level in Computer Science, Computer Engineering or a related discipline