JOB DESCRIPTION:

  • Design and maintain the data pipeline architecture
  • Process large, complex data sets from various sources
  • Build analytics tools that utilize the data pipeline to provide actionable insights to the relevant stakeholders
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
  • Other data-related tasks, as needed


REQUIREMENTS:

  • Excellent English communication skills are a must
  • Strong skills in working with files and directories in Linux, as well as SQL
  • At least 3 years of experience working as a Senior Java Software Developer; experience with Kafka, Spark, Storm, Hadoop, or NoSQL is a big plus
  • Skills in architecture design are a definite plus; experience with big data/streaming technologies is nice to have, and additional Python/Perl knowledge would be a big benefit
  • Comfortable working with engineers and other technical staff
  • Customer-focused mentality
  • Ability to assist engineers in the ETL process
  • Knowledge and understanding of vendor market data offerings from Thomson Reuters, Bloomberg, S&P Global, FactSet, and other market data providers is a plus