BIG DATA ARCHITECT


    Intelletec
Big Data Architect (Kafka / Hadoop / NiFi)
6 Months+ | £600.00 – £700.00 per day (dependent on experience) | Central London

A Big Data Architect (Kafka / Hadoop / NiFi) is required.

Key Responsibilities
– Help to design and develop the new real-time data platform
– Conceptualise, evaluate and build proofs of concept for new models, tools and techniques
– Work alongside other team members, acting as a mentor when needed and sharing key skills
– Monitor and tune the performance of the system
– Create design patterns, frameworks and standards for the codebase and infrastructure

Essential Skills
– Expert, professional knowledge of the core Hadoop v2 distribution stack: HDFS, Spark, Pig, Hive, Impala and ZooKeeper
– Understanding of distributed computing principles
– Expert knowledge of the Java/Scala programming languages
– Expert knowledge of stream-data handling, including Kafka, Kafka Streams, Spark Streaming and Storm (see the sketch after this list)
– Expert knowledge of messaging and data-movement platforms, including Kafka, Apache NiFi, Flume, Sqoop and RabbitMQ
– An expert understanding of data integration
– A sufficient understanding of relational databases, including Oracle, SQL Server and MySQL
– A sufficient understanding of NoSQL databases, including HBase, Cassandra and MongoDB
– Sufficient knowledge of data-model design techniques, including OLTP and dimensional modelling
– Some experience with popular distributions such as Cloudera, MapR and Hortonworks
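As a flavour of the stream-processing work listed above, here is a minimal Kafka Streams sketch in Scala (one of the languages the role calls for). It maintains a running count of events per key from one topic into another; the application id, broker address and topic names are illustrative assumptions, not details from the posting.

import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.serialization.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object EventCountApp extends App {
  // Basic client configuration; the application id also serves as the consumer group id.
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-count-app")   // assumed name
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // assumed broker

  val builder = new StreamsBuilder()

  // Read events keyed by, say, a device or user id, and keep a running count per key.
  builder
    .stream[String, String]("events") // hypothetical input topic
    .groupByKey
    .count()
    .toStream
    .to("event-counts")               // hypothetical output topic

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.addShutdownHook(streams.close())
}

The Scala DSL's implicit serdes keep the topology declarative; the same pattern extends to windowed aggregations and joins, which is typical of the real-time platform work described in the responsibilities above.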