Data Engineer


Since our establishment in 2013, we have managed every process a mobile application requires under the same roof, investing in startups with global growth potential and enabling them to reach their audiences. With the nearly 150 B2C products we have developed to date, we simplify people's daily lives and offer them unique experiences. Drawing on our technology and marketing know-how, we create individual and corporate business models.

As one of the leading companies in the IT industry, we shape the sector and carry out creative, innovative projects in line with the requirements of the digital age. Our mobile applications touch the lives of more than 3 billion users, and we expand our impact by transforming our in-house solutions into SaaS products. With our Rockads, VerifyKit, Desk360, and Zotlo services, we offer businesses solutions that allow them to manage their operations effectively on a global scale.

Apply now if you want to join the strong and ever-growing Teknasyon team and become part of this story! You can be the “Data Engineer” we are looking for!


Qualifications

  • A strong software engineering background,
  • A minimum of 3 years of experience,
  • Python and/or Scala experience on real projects,
  • Hands-on experience with at least some of these big data technologies: Spark, Hadoop, Hive, Presto, Cloudera,
  • Experience with stream processing (e.g., Kafka, Kinesis, Pub/Sub, Spark Streaming, Dataflow, Apache Flink),
  • Experience with ETL processes,
  • Experience with at least one SQL database, such as MySQL or PostgreSQL,
  • Experience with NoSQL databases such as Amazon DynamoDB, Cassandra, or MongoDB,
  • Experience with cloud platforms (Amazon Web Services or Google Cloud Platform) and related tools such as Dataflow, BigQuery, EMR, Redshift, and Athena,
  • Experience building new data pipelines in a GCP or AWS environment,
  • Familiarity with the Linux shell environment,
  • Knowledge of data warehousing (DWH) fundamentals and exposure to BI projects,
  • Strong teamwork and communication skills with a solution-oriented attitude,
  • Detail-oriented, careful, and analytical thinking,
  • High motivation for innovation.


Responsibilities

  • Designing data engineering infrastructures, pipelines, data lakes, and cloud architectures for ML/data applications,
  • Building bulk and streaming data pipelines and analytic solutions using tools such as Kubernetes, Kafka, Dataflow, BigQuery, and Airflow,
  • Designing and developing data models for Data Scientists,
  • Ensuring the best ETL performance by identifying the needed indexes, partitions, materialized views, and so forth,
  • Modifying the physical data model for efficient data access paths,
  • Translating business requirements into data models that are easy to understand and use by different disciplines across the company,
  • Designing, developing, and maintaining scalable and reusable streaming or batch data pipelines for analysis, reporting, optimization, and data collection, management, and usage,
  • Ensuring security, privacy, and compliance for all data assets.


What We Offer

  • Being part of a big team that builds global apps,
  • A team that values open communication, multidimensional leadership, and teamwork,
  • A dynamic work environment that relies on collaboration,
  • Never-ending opportunities for learning and improvement,
  • Internal, external, and online training,
  • An office full of joy and activities,
  • Monthly social events organized to lift team spirit,
  • Bonus pay when a colleague thanks you,
  • Bonus pay for referring a friend who gets hired.