23 Jan
S.i. Systems
Toronto
Senior Java Developer with data engineering, Apache Flink, Beam and Kafka experience to work on a capital markets project with one of our major banking clients- 36062
Location Address: Hybrid - Toronto (1 day/week) - *Can be a remote role if strong candidates are not available in the GTA, but the candidate must be in Canada and available in the EST time zone*
Contract Duration: 6 months (Possibility of Extension)
Schedule Hours: 9am-5pm, Monday-Friday
Story Behind the Need
- Business group: Markets & Enterprise Technology - We are part of the Cross-Asset Engineering Team, which is responsible for designing, developing, and maintaining the in-house Front Office Blotter for ETF and derivatives desks in global wholesale banking.
- Project:
Regulatory Short Order Marking.
Candidate Value Proposition
- The successful candidate will have the opportunity to build out the Cerberus Cross-Asset Blotter, a front-office application for the trading desk. The application tracks trades and positions in real time, consuming data from and publishing data to Kafka. For the Regulatory Short Order Marking project, the application's coverage of orders and trades must be expanded to track positions at a bank-wide level. Traders will use this information to properly mark orders as short. There will also be integration with upstream algorithmic and execution trading systems.
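To illustrate the kind of logic involved, the position tracking and short-order marking described above can be sketched roughly as below. This is a simplified illustration only: the class, method names, and the marking rule are hypothetical, not the project's actual design, and a real system would consume fills from Kafka rather than direct method calls.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Hypothetical sketch: maintains net positions per symbol from a stream of
 * fills and decides whether a new sell order must be marked as a short sale.
 */
public class ShortOrderMarker {
    // Net signed position per symbol (positive = long, negative = short).
    private final Map<String, Long> netPositions = new HashMap<>();

    /** Apply a fill: positive quantity is a buy, negative is a sell. */
    public void onFill(String symbol, long signedQty) {
        netPositions.merge(symbol, signedQty, Long::sum);
    }

    /**
     * Simplified rule for illustration: a sell order is "short" when the
     * current net position does not cover the full sell quantity.
     */
    public boolean isShortSale(String symbol, long sellQty) {
        return netPositions.getOrDefault(symbol, 0L) < sellQty;
    }
}
```

In a streaming deployment, the per-symbol state above would typically live in keyed state inside a Flink or Beam pipeline, keyed by symbol, with fills arriving as Kafka records.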
Typical Day in Role:
- Works closely with end-users, Business Analysts, and team members to understand business requirements that drive the analysis and design of quality technical solutions. Must take an interest in understanding the business functions of the end-users.
- Involved in the full systems life cycle and is responsible for designing, coding, testing, implementing and supporting application software that is delivered on time and within budget.
- Contributes to the design of new applications and undertakes enhancements. The work is largely net-new code; the project also involves decommissioning an aged application.
- Makes recommendations towards the development of new code or reuse of existing code.
- 100% back end
Must Have Skills:
- 10+ years: Core Java development and recent experience in Data engineering
- 5+ years: Apache Flink, Apache Beam, and Apache Kafka
- Working experience using Maven and Git
- Foundational knowledge of SOLID principles
- Agile and Waterfall methodologies
Nice-To-Have Skills:
- Databases: MS-SQL and Oracle are an asset
- Additional CI/CD tools such as Bitbucket, Jenkins, Artifactory, and Docker are an asset
- 2+ years: Kubernetes and Docker
- Capital Markets experience
Degrees or certifications:
- Bachelor’s degree in a technical field such as computer science, computer engineering, or a related field (required)
Best Vs Average:
- Data Engineer background and data processing experience, Capital Markets experience working with Equities and front office
Candidate Review & Selection
- Structure and Format:
- 1st round - Technical interview with a senior developer and a team member - MS Teams - 60 mins
- 2nd round - Behavioral / fit review - MS Teams - 30 mins