
Data Engineer (Mid-Level)

Sydney/Hybrid - Full Time Employee


About Us

Unleash live is a world-leading AI video analytics platform revolutionizing the way organizations in infrastructure, renewables, cities, and transport harness insights from visual data. Our cutting-edge AI technology transforms any video feed into real-time, actionable insights, empowering customers to make smarter, faster decisions.

Unleash live is expanding, with a focus on the Australian and US markets. Our team is globally distributed across Sydney, Poland, and the US, creating a collaborative and culturally diverse environment. If you're passionate about innovation and thrive in a fast-paced, dynamic setting, Unleash live offers unparalleled access to career growth, multi-functional experience, and direct interaction with senior leadership.

Position Summary

With the high volume and velocity of data generated by constantly running AI-driven video analytics, Unleash live creates vast opportunities to uncover insights hidden in data that would otherwise go unseen. We pride ourselves on a strong data pipeline supporting our platform, enabling our clients to gain extremely valuable insights that are only possible through analyzing the data produced by our Video Analytics Platform.

If you are relentless about enabling highly valuable business insights through the discovery and generation of high-quality data, and you thrive in a fast-paced, innovative, diverse and technology-driven environment, then this role might be for you.

Required Skills

  • Strong knowledge of AWS services (AWS Glue, Athena, Redshift, DynamoDB, SageMaker, etc.) and comfort working with cloud-based resources

  • Skilled in both SQL and NoSQL databases (e.g. Redshift, PostgreSQL, MongoDB, DynamoDB)

  • Experienced in adopting a Medallion Architecture

  • Skilled in distributed computing with Apache Spark

  • Experienced in building data transformation pipelines and performing feature engineering on unstructured/semi-structured data

  • Working knowledge of RAG, MCP, and vector databases

  • Proven track record working with large datasets, ML inference workflows, and performance analysis.

  • Familiarity with tools like Airflow, dbt, Kafka, or equivalent orchestration and ETL frameworks.

  • Working knowledge of building and consuming REST APIs

  • Expert Python programmer who codes using software engineering best practices

You Will

  • Continuously design, build and maintain the following:

    • End-to-end data pipeline solutions (from ingestion to serving)

    • Database solutions for both data lake and warehouse

    • Data transformation jobs as and when required

  • Define and enforce data quality standards across the business

  • Ensure high data quality, reproducibility, and pipeline reliability

  • Heavily utilise AWS cloud resources to build, scale, and monitor data workflows

  • Collaborate with the engineering and AI teams to remove blockers and deliver results

  • Document solutions and diagram architectures

You Are

  • A highly professional, committed, accountable and adaptable individual

  • Results-oriented, with a “you see it, you own it” approach

  • Highly detail-oriented and rigorous in enforcing quality standards

  • Able to work independently and cross-functionally with other members of the team

  • Comfortable in communicating both good and bad news to all stakeholders

  • Able to produce efficient, robust and scalable solutions

  • Passionate about cutting-edge tech within the data engineering space

Nice to Have

  • Expert-level experience in building data pipelines

  • Proficient-level expertise in building solutions using AWS

  • Skilled in using Databricks, Snowflake, or other data intelligence platforms

  • Skilled in using Infrastructure as Code (IaC) tools, e.g. Terraform, SLS, CloudFormation

  • Adept at building ML-related data pipelines

  • Working knowledge of APIs, data science/analytics, and computer vision concepts

  • Proficient in more than one programming language (e.g. Python, Bash, JavaScript, C/C++, Ruby, Go)

Apply Now

Please enter your details and attach your resume to the form below to apply now.