IAM Hadoop Reporting SME
- Type: Permanent
- Salary: US$120,000 - US$135,000 per annum + competitive benefits
- Location: Piscataway, New Jersey
- Sectors: Banking and Finance, IT, Banking IT, Data and Business Intelligence, Developer, Security, Network and Cloud
One of the largest investment banks in the world is seeking to add a full-time, permanent IAM Hadoop Reporting SME to its team in Piscataway, New Jersey. As the Senior Lead Developer, you will be responsible for designing and developing Big Data ETL workflows and data engineering pipelines that enable the implementation of reports for a variety of projects. You will work across multiple departments, collaborating directly with stakeholders, developers, business analysts and testers on design and development.
This is a great opportunity to work on a new strategic data lake framework, which will form the foundation of the next-generation Compliance Transformation solutions.
- Develop, implement and support Hadoop Big Data and relational databases.
- Develop functional and technical specification documentation.
- Develop fully functional modules that meet all the specifications and have successfully passed all tests including unit, integration, regression and system tests.
- Coordinate with other onshore and offshore developers to deliver working code.
- Coordinate with Business Analysts, Testers and Stakeholders.
- Participate in regular release processes and Agile ceremonies, delivering on commitments during each iteration.
- BS/MS degree in Computer Science, Engineering or a related subject
- 10+ years of software development experience.
- 8+ years of experience designing and implementing Data Warehouse and Reporting applications using SSIS, SQL Server, Oracle or related databases.
- 5+ years of experience designing and implementing Big Data Reporting and Analytics applications.
- 5+ years of experience with Hadoop ecosystem technologies such as HDFS, YARN, Hive, HBase or Cassandra, Impala, Sqoop, Flume and Oozie.
- 4+ years of experience in Scala and Java programming.
- 2+ years of experience in Kafka and Apache Spark streaming.
- 4+ years of experience with the Cloudera or Hortonworks distribution.
- Strong experience in UNIX Bash and shell scripting.
- Strong analytical and technical design documentation skills.
- Strong troubleshooting, problem solving and performance tuning skills.
- Knowledge of modern open-source databases, including NoSQL and graph databases such as ArangoDB.
- Good knowledge of statistical modelling with R or Python.
Sthree US is acting as an Employment Agency in relation to this vacancy.