Cali

Data Engineer

Responsibilities

  • Design and implement data architectures in the cloud or on-premises.
  • Develop logical data models for OLTP and OLAP processing.
  • Develop data pipelines for extracting, cleansing, transforming, and enhancing data.
  • Discover and elicit business and data requirements.
  • Team-oriented attitude and the ability to work well with others toward a common goal.
  • Openness to working in an agile environment as part of a scrum team.
  • Ability to take the initiative, drive projects, and innovate.
  • Proactive attitude toward solving problems.
  • Good client-facing skills.
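The extract, cleanse, and transform steps named above can be sketched in plain Python. This is a minimal illustration only, not any specific Endava pipeline; all function names and the sample records are hypothetical.

```python
# Illustrative extract-cleanse-transform pipeline in plain Python.
# All names (extract, cleanse, transform) and data are hypothetical.

def extract():
    """Simulate extraction from a source system (here: an in-memory list)."""
    return [
        {"id": 1, "amount": " 100.50 ", "country": "co"},
        {"id": 2, "amount": "n/a", "country": "CO"},
        {"id": 3, "amount": "75", "country": " co"},
    ]

def cleanse(rows):
    """Drop rows with unparseable amounts and normalize text fields."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"].strip())
        except ValueError:
            continue  # discard records that fail validation
        clean.append({"id": row["id"], "amount": amount,
                      "country": row["country"].strip().upper()})
    return clean

def transform(rows):
    """Enhance the data: aggregate amounts per country."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

result = transform(cleanse(extract()))
print(result)  # {'CO': 175.5}
```

In a real engagement the same shape would typically run on PySpark or a similar distributed engine rather than in-memory lists.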

 

Qualifications and Experience

  • Experience with Python and PySpark.
  • Strong SQL and scripting experience.
  • Experience implementing ELT processes and data pipelines.
  • Experience in data modeling (normalized and multidimensional).
  • Background and experience with cloud data technologies and tools.
  • Familiar with distributed computing environments.
  • Familiar with data tools and technologies like:
    • BigQuery, Redshift, Snowflake, or other data warehouse tools.
    • Spark, Hadoop, Apache Beam, Dataproc, or similar.
    • Real-time pipelines with Kinesis or Kafka.
    • Batch processing.
    • Serverless processing.
  • Good leadership and communications skills.
  • English B1+.
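The ELT pattern listed above (load raw data first, then transform inside the warehouse with SQL) can be sketched as follows. This uses Python's built-in sqlite3 as a stand-in for a real warehouse such as BigQuery, Redshift, or Snowflake; the table and column names are hypothetical.

```python
import sqlite3

# Illustrative ELT sketch: land raw data as-is, then transform in SQL.
# sqlite3 stands in for a cloud warehouse; names are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")

# Load: insert the source records without reshaping them first.
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "south", 120.0), (2, "south", 80.0), (3, "north", 200.0)],
)

# Transform: derive a summary table from the raw layer using SQL.
conn.execute("""
    CREATE TABLE region_sales AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM raw_orders
    GROUP BY region
""")

for row in conn.execute(
    "SELECT region, total_amount, order_count FROM region_sales ORDER BY region"
):
    print(row)
# ('north', 200.0, 1)
# ('south', 200.0, 2)
```

The same load-then-transform split applies unchanged when the target is a managed warehouse and the transforms run as scheduled SQL jobs.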
