
Senior Data Engineer/Machine Learning Engineer - Moldova
- Remote
- Chișinău, Moldova
- Data Engineering
Job description
Yopeso has been developing a diverse range of software products, from large-scale applications to smaller solutions, for 20 years. With a growing team of over 250 employees across five locations, we are dedicated to fostering a culture of growth, transparency, and professionalism.
At Yopeso, we value authenticity, curiosity, and ambition. These values drive us to build strong connections within our community and with our partners, ensuring trust, integrity, and transparency in all our business practices. We strive to maintain the highest professional standards and continuously challenge ourselves to develop high-quality, high-performance, and secure software solutions.
Our approach is rooted in efficient collaboration among passionate professionals working in agile teams. Guided by curiosity and ambition, we aim to create products that are meaningful and impactful, while remaining true to our authentic selves.
What we offer:
Competitive remuneration
Remote work
Sports/leisure benefit
20 sick leave days paid at 100%
32 calendar days of vacation
Team events, whether online, at the office, or offsite
Professional development plan with guidance and mentorship
Training and development opportunities with allocated budget
Professional certifications
Optional medical insurance
Job requirements
We are looking for a Senior Data Engineer (5+ years of experience) with expertise in Google Cloud, data science, machine learning, AI prompt engineering, and dashboarding. You will design and optimize data pipelines, AI-driven workflows, and BI solutions while ensuring best practices in software engineering, CI/CD, and cloud infrastructure.
Key Responsibilities & Skills:
Google Cloud & Data Engineering:
Strong expertise in BigQuery, SQL optimization, and dbt.
Proficient in Kafka and Cloud Composer / Apache Airflow for orchestration.
Hands-on with IAM, Data Catalog, and Infrastructure as Code (Terraform/Deployment Manager).
Experience with Dataflow, Kubernetes (GKE), Vertex AI Pipelines, and Kubeflow Pipelines.
Programming & DevOps:
Expert in Python (OOP, best practices, testing with pytest and unittest).
Experience with flake8, mypy, black, SonarQube, and pre-commit hooks.
Proficient in Unix, shell scripting, Docker, and CI/CD pipelines (GitHub Actions, Azure DevOps).
Strong Git skills: PR handling, conflict resolution, GitOps best practices.
Data Science & Machine Learning:
Hands-on with pandas, NumPy, scikit-learn, and TensorFlow.
Strong in feature engineering, data preprocessing, and model training and deployment.
Experience with MLOps automation and running ML models in production.
Knowledge of AI prompt engineering for optimizing LLM interactions.
Dashboarding & Data Visualization:
Proficient in Power BI, Tableau, and Looker for building interactive dashboards.
Experience integrating BigQuery or other cloud databases into BI tools.
Strong data modeling and visualization skills.
Soft Skills:
Proactive, problem-solving mindset with strong analytical skills.
Effective communication with stakeholders, data scientists, and engineers.
Passion for code quality, code reviews, and best practices.
Fluency in English.
Nice to have: Google data engineering skills, familiarity with MLOps, Google Vertex AI Pipelines, and CI/CD processes.