Data Specialist

November 5, 2024

  • Full Time
  • Gothenburg
  • Considering only candidates located in Sweden 🇸🇪 or elsewhere in Europe 🇪🇺 who are willing to relocate.

We are looking for a senior Data Specialist. Our focus is to make the creation of data products more efficient through a streamlined journey that leverages our platform capabilities. Our mission is to deliver on our business goals with general-purpose data engineering solutions and methodologies, aligned with best practices from the company and the industry. You will work in an ever-evolving agile environment as part of an empowered product team that strives for continuous learning and improvement.

You will provide support, guidelines, and best practices for creating and operating data products, and you will create and maintain data products based on business requirements. You will collaborate with your Product Owner to ensure the right functionality is being developed and that each work package can be broken down into tangible items with complexity estimates, so that upcoming sprints can be planned. You will determine and maintain the technical design in our area within the given architectural guidelines. Interacting with stakeholders and the System Architect is an equally important part of your daily work.

Requirements:

  • Python.
  • Apache Spark.
  • Bash.
  • Table formats (Delta).
  • Cloud Platform technology (Azure).
  • Docker.
  • Data Platforms (Databricks/Snowflake).
  • GitHub.
  • Familiar with Agile methods.
  • Fluent in English, both spoken and written.

Required Skills

  • Deep knowledge of, and passion for, modern data architecture principles.
  • Experience in design, development and operation of data pipelines.
  • Extensive hands-on experience as a Data Engineer in an agile environment.
  • Comfortable working independently and collaboratively.
  • Pragmatic approach, understanding the trade-offs between the perfect solution and a working solution.
  • Passion for improving code quality as well as data quality.
  • Knowledgeable about data modeling, data access, and data storage techniques.
  • Experience with cloud technologies (Azure and AWS).
  • Experience in applying data privacy and protection methods/tools.
  • Additionally, knowledge of Software Engineering, System Design, SQL, Orchestration Tools (Airflow/Prefect/Dagster), Data Streaming (Apache Kafka), CI/CD toolchains, IaC, GitOps, DevOps, and Networking is essential.
  • Data is integral to everything we do, so we expect a certain level of data acumen; any experience working with the complexities surrounding data is considered useful.
  • Experience applying product thinking when working with data.
  • Knowledge of data governance.
  • Familiarity with modern data architecture such as data mesh.
  • Experience in creating reports for conveying business information or insights.
