Data Engineer (100% Remote, Project role up to 6 months)
Lykke
- Zug
- Freelance
- Full-time
- Design, construct, install, test, and maintain highly scalable data management systems.
- Ensure systems meet business requirements and industry practices.
- Integrate new data management technologies and software engineering tools into existing structures.
- Create data tools that help the analytics and data science teams build and optimize our product into an innovative industry leader.
- Establish repeatable processes for data mining, data modeling, and data production.
- Collaborate with data architects, quants, modelers, and IT team members on project goals.
- Ensure data privacy and compliance with data protection regulations.
- Bachelor's or Master's degree in computer science, information technology, engineering, or a related field.
- Proven experience as a Data Engineer, Software Developer, or similar role.
- Proficiency in languages used for scripting and data science: Python, Java, Scala.
- Strong analytical skills: able to collect, organize, analyze, and disseminate large amounts of information with attention to detail and accuracy.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Candidates with prior experience in developing and implementing trading algorithms within the banking, fintech, or cryptocurrency exchange sectors will be highly valued.
- Proficiency in quantitative analysis, algorithmic trading strategies, and risk management techniques.
- Previous exposure to real-time market data feeds, order execution platforms, and trading APIs is advantageous.
- Familiarity with high-frequency trading (HFT) systems, algorithmic trading libraries, and backtesting frameworks is desirable.
- Applicants with a deep understanding of market microstructure, liquidity dynamics, and order routing protocols will be prioritized, as they are best placed to design resilient, scalable data solutions tailored to algorithmic trading requirements.