Senior Data Engineer/Machine Learning Engineer

  • Remote
  • Cluj-Napoca, Cluj, Romania
  • Data Engineering

Job description

Yopeso has been developing a diverse range of software products, from large-scale applications to smaller solutions, for 19 years. With a growing team of over 250 employees across five locations, we are dedicated to fostering a culture of growth, transparency, and professionalism.

At Yopeso, we value authenticity, curiosity, and ambition. These values drive us to build strong connections within our community and with our partners, ensuring trust, integrity, and transparency in all our business practices. We strive to maintain the highest professional standards and continuously challenge ourselves to develop high-quality, high-performance, and secure software solutions.

Our approach is rooted in efficient collaboration among passionate professionals working in agile teams. Guided by curiosity and ambition, we strive to create products that are meaningful and impactful, while remaining true to our authentic selves.

What we offer:

  • Competitive remuneration

  • Remote work

  • 24 days off per year and floating days

  • Private clinic health services via Regina Maria medical insurance

  • Flexible benefits through the Up multibenefits platform

  • Referral bonus scheme

  • Team events, online or at the office

  • Training and development opportunities with an allocated budget

  • Professional certifications

  • A knowledge-sharing environment

Job requirements

We are looking for a Senior Data Engineer (5+ years of experience) with expertise in Google Cloud, Data Science, ML, AI prompt engineering, and dashboarding. You will design and optimize data pipelines, AI-driven workflows, and BI solutions while ensuring best practices in software engineering, CI/CD, and cloud infrastructure.

Key Responsibilities & Skills:

Google Cloud & Data Engineering:

  • Strong expertise in BigQuery, SQL optimization, and dbt.

  • Proficient with Kafka and Cloud Composer / Apache Airflow for orchestration (see the sketch after this list).

  • Hands-on with IAM, Data Catalog, and Infrastructure as Code (Terraform/Deployment Manager).

  • Experience with Dataflow, Kubernetes (GKE), Vertex AI Pipelines, and Kubeflow Pipelines.
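
For orientation, a minimal sketch of the kind of orchestration work described above: a daily Airflow DAG that runs a BigQuery transformation. The DAG id, project, dataset, and table names are hypothetical placeholders, not details from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    # Daily pipeline that rolls raw orders up into per-day revenue.
    with DAG(
        dag_id="daily_orders_rollup",        # hypothetical pipeline name
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        rollup = BigQueryInsertJobOperator(
            task_id="rollup_orders",
            configuration={
                "query": {
                    "query": """
                        SELECT order_date, SUM(amount) AS revenue
                        FROM `my-project.sales.orders`  -- hypothetical table
                        GROUP BY order_date
                    """,
                    "useLegacySql": False,
                },
            },
        )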

Programming & DevOps:

  • Expert in Python (OOP, best practices, testing with pytest and unittest); a testing sketch follows this list.

  • Experience with flake8, mypy, black, SonarQube, pre-commit hooks.

  • Proficient in Unix, Shell scripting, Docker, and CI/CD pipelines (GitHub Actions, Azure DevOps).

  • Strong Git skills: PR handling, conflict resolution, GitOps best practices.
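
As a flavor of the testing and typing practices listed above, a minimal sketch: a typed helper function with pytest tests. The function and its names are illustrative, not taken from any real codebase.

    import pytest

    def normalize_ratio(numerator: float, denominator: float) -> float:
        """Return numerator / denominator, guarding against division by zero."""
        if denominator == 0:
            raise ValueError("denominator must be non-zero")
        return numerator / denominator

    # Parametrized happy-path cases.
    @pytest.mark.parametrize(
        ("num", "den", "expected"),
        [(1.0, 2.0, 0.5), (3.0, 3.0, 1.0)],
    )
    def test_normalize_ratio(num: float, den: float, expected: float) -> None:
        assert normalize_ratio(num, den) == expected

    # Error path: the guard must raise rather than divide by zero.
    def test_zero_denominator_raises() -> None:
        with pytest.raises(ValueError):
            normalize_ratio(1.0, 0.0)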

Data Science & Machine Learning:

  • Hands-on with Pandas, NumPy, Scikit-learn, TensorFlow.

  • Strong in feature engineering, data preprocessing, and model training & deployment (see the sketch after this list).

  • Experience with MLOps automation and ML models in production.

  • Knowledge of AI prompt engineering for optimizing LLM interactions.
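
For illustration, a minimal sketch of the preprocessing-and-training flow referenced above, using Pandas and Scikit-learn. The toy dataset, column names, and model choice are hypothetical assumptions.

    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder, StandardScaler

    # Hypothetical toy dataset: predict churn from age and country.
    df = pd.DataFrame({
        "age": [25, 32, 47, 51, 29, 60],
        "country": ["RO", "DE", "RO", "FR", "DE", "RO"],
        "churned": [0, 1, 0, 1, 0, 1],
    })

    # Feature engineering: scale numeric columns, one-hot encode categoricals.
    preprocess = ColumnTransformer([
        ("num", StandardScaler(), ["age"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["country"]),
    ])

    # One pipeline object keeps preprocessing and training in sync,
    # which also simplifies deployment of the fitted model.
    model = Pipeline([
        ("prep", preprocess),
        ("clf", LogisticRegression()),
    ])
    model.fit(df[["age", "country"]], df["churned"])
    print(model.predict(df[["age", "country"]]))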

Dashboarding & Data Visualization:

  • Proficient in Power BI, Tableau, Looker for interactive dashboards.

  • Experience integrating BigQuery or other cloud databases into BI tools (see the sketch after this list).

  • Strong data modeling and visualization skills.
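
As a small example of the BI integration point above, a sketch that pulls BigQuery results into a DataFrame, assuming the google-cloud-bigquery client library and default application credentials. Project and table names are hypothetical.

    from google.cloud import bigquery

    # Assumes GOOGLE_APPLICATION_CREDENTIALS (or gcloud auth) is configured.
    client = bigquery.Client(project="my-analytics-project")  # hypothetical project

    sql = """
        SELECT order_date, SUM(amount) AS revenue
        FROM `my-analytics-project.sales.orders`  -- hypothetical table
        GROUP BY order_date
        ORDER BY order_date
    """

    # to_dataframe() waits for the query job and returns a pandas DataFrame,
    # which can then feed a Power BI, Tableau, or Looker data source.
    revenue_df = client.query(sql).to_dataframe()
    print(revenue_df.head())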

Soft Skills:

  • Proactive, problem-solving mindset with strong analytical skills.

  • Effective communication with stakeholders, data scientists, and engineers.

  • Passion for code quality, code reviews, and best practices.

  • Fluency in English.

Nice to have: Google data engineering skills, familiarity with MLOps, Google Vertex AI Pipelines, and CI/CD processes.
