Data Engineer Trainee

Remote | Pakistan

Full Time

About the role:

Are you passionate about building and optimizing data pipelines to drive AI and analytics solutions? If yes, we have an exciting opportunity for you at Data Science Dojo.

We are seeking a skilled and motivated Data Engineer to join our dynamic team. In this role, you will configure and manage components of big data pipelines, build and launch data extraction processes, and collaborate with cross-functional teams to meet diverse data requirements. Your work will play a critical role in enabling AI automation and enhancing our data analysis capabilities.

What you will do:

  • Configuring components for big data pipelines to address diverse data science challenges.
  • Building and managing data pipelines for AI Automation, including data preprocessing tasks.
  • Creating robust data pipelines to facilitate data analysis and insights generation.
  • Contributing to product development using cutting-edge AI tools and cloud services.
  • Independently designing, building, and launching new data extraction, transformation, and loading processes in production.
  • Collaborating with cross-functional teams to meet data requirements and communicate insights effectively.
  • Designing and implementing data transfer solutions for various systems.

What we are looking for:

  • Pursuing or having completed a degree in Computer Science, Mathematics, Software Engineering, Computer Engineering, or a related field.
  • Basic understanding of data engineering concepts: data modeling, ETL, data warehousing.
  • Familiarity with relational and NoSQL databases (MySQL, PostgreSQL, MongoDB).
  • Knowledge of SQL or similar query languages.
  • Awareness of cloud platforms and services for data engineering (AWS, Azure, Google Cloud).
  • Basic proficiency in Python (or a comparable language) for data manipulation and visualization, e.g. NumPy, Pandas, Matplotlib, Seaborn.
  • Knowledge of version control (Git).
  • Strong problem-solving and analytical thinking.
  • Strong organizational skills for multitasking and attention to detail, ensuring high-quality work.
  • Quick learning and adaptability to new technologies.
  • Willingness to pursue further training and certifications, with a commitment to continuous learning in data engineering techniques and technologies.
  • Effective communication of technical concepts and collaboration with team members.
  • Interest in data and technology and eagerness to apply data engineering skills to real-world problems.

Nice to have: 

  • Certifications in Azure Fundamentals (AZ-900), Azure Data Fundamentals (DP-900), or Azure Data Engineer Associate (DP-203) 
  • Knowledge of data processing technologies (Apache Spark, Hadoop, AWS Glue). 
  • Understanding of data visualization tools (Tableau, Power BI). 
  • Familiarity with data integration techniques and tools. 
  • Familiarity with workflow automation (Apache Airflow). 
  • Basic knowledge of containerization (Docker, Kubernetes). 
  • Knowledge of data serialization formats (JSON, Parquet, XML, Avro). 

Apply Now