Posted : Friday, February 09, 2024 12:23 PM
ABOUT US
We’re like-minded, curious, excitable people here at Chiliz who work well in teams, spread across the globe.
Chiliz is a global blockchain company that powers Socios.com, the creators of Fan Tokens and a popular fan rewards platform.
Socios has partnered with some of the world’s best teams, including Paris Saint-Germain, Juventus, FC Barcelona, Atlético de Madrid, UFC, Galatasaray, Manchester City FC, Davis Cup, and many more.
The curious nature of a Chilizen is what drives this company forward, and since we’re looking to grow even more, apply for your dream role today.
OUR BRANDS & CHANNELS
We are building the web3 infrastructure for sports & entertainment! Founded in 2018, Chiliz is a blockchain provider focused on the sports and entertainment industry.
We build scalable, secure, blockchain-enabled solutions that supercharge fan experiences using digital assets.
$CHZ is the native digital token for the Chiliz sports & entertainment ecosystem, currently powering Socios.com and the Chiliz Chain blockchain.
Socios.com is a fan engagement and rewards app that allows fans to engage with their favourite teams and clubs through digital assets known as Fan Tokens.
THE ROLE
The Analytics Engineer will play a key role in designing, developing, and maintaining our data infrastructure, enabling the organization to extract valuable insights from diverse datasets.
The candidate will collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure the seamless flow of data and facilitate data-driven decision-making.
Duties and Responsibilities:
Data Modeling and Warehousing
- Create and optimize data models to support analytics and reporting requirements.
- Implement and maintain data warehousing solutions to store and retrieve structured and unstructured data on AWS Redshift.
ETL Processes
- Build and optimize Extract, Transform, Load (ETL) processes for moving and transforming data between systems.
- Implement data integration solutions to ensure a unified and coherent data ecosystem.
Data Pipeline Development
- Design, develop, and maintain scalable, efficient data pipelines for ingesting, processing, and transforming data from various sources with SQL and Python.
Automation and Orchestration
- Develop and implement automation scripts for data processing tasks.
- Orchestrate workflows using tools such as Apache Airflow.
Version Control
- Use version control systems (e.g., Git) for code collaboration and change tracking.
Reporting and Visualisation
- Implement business intelligence solutions by developing and maintaining reports and dashboards to business owners' requirements, using Metabase and Tableau.
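To give a flavour of the modeling and ETL duties above, here is a minimal, hypothetical sketch of an extract-and-transform step. All table and column names are invented, and SQLite stands in for a warehouse such as Redshift; in practice this logic would live in a pipeline orchestrated by a tool like Airflow.

```python
import sqlite3

# Hypothetical sketch: load raw fan-token transactions, then build a small
# aggregate model (SQLite here stands in for a warehouse such as Redshift).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_transactions (user_id TEXT, token TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_transactions VALUES (?, ?, ?)",
    [("u1", "PSG", 10.0), ("u1", "JUV", 5.0), ("u2", "PSG", 2.5)],
)

# Transform: one row per token with total volume (a toy "data model").
conn.execute("""
    CREATE TABLE token_volume AS
    SELECT token, SUM(amount) AS total_amount
    FROM raw_transactions
    GROUP BY token
""")

for row in conn.execute("SELECT token, total_amount FROM token_volume ORDER BY token"):
    print(row)
# Prints: ('JUV', 5.0) then ('PSG', 12.5)
```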
Requirements:
- 3+ years of experience in a similar role.
- Bachelor's degree or higher in a quantitative/technical field (e.g. Computer Science, Engineering, or similar).
- Great at communicating ideas and working in a team.
- Advanced data warehousing experience with ETL work.
- Working knowledge of dbt for data modeling.
- Expert knowledge of SQL and experience using cloud-based data warehouses such as Redshift, Snowflake, or Google BigQuery.
- Experience building data pipelines and using orchestration tools such as Airflow.
- Solid proficiency in Python, developing data integration and analytical solutions.
- Good understanding of cloud infrastructure platforms; AWS experience is a plus.
- Self-starter and detail-oriented; able to complete projects with minimal supervision.
Desirable / Nice to have:
- Experience with Delta Lake infrastructures and large-dataset processing with Apache Spark / PySpark.
- Proven experience using dashboard and reporting tools such as Metabase, Tableau, or Power BI.
- Familiarity with AWS data services (Glue, EMR, Kinesis, Athena).
- Knowledge of designing CI/CD pipelines and Infrastructure-as-Code experience, preferably Terraform.
- Previous exposure to REST API data integration.
WHAT WE OFFER
We offer you the chance to grow, to learn, to flex your creative muscles, and to work on a project that is providing excitement to thousands of users.
During our interview process you’ll be able to ask us anything and get to know your team too.
We need this to work both ways: It's not just about you fitting in, but about us being the right fit for you too.
Are you ready to work with the world’s best sports teams? Are you happy to try, fail and try again? Are you excited to keep pushing the boundaries of technology? We’ve got offices across the world, over 30 nationalities in our ranks and the most important superpower of all - flexibility.
Our competitive salaries, wellness allowance, healthcare and pension plan are just the tip of the iceberg.
You’ll gain friends, experience, and a good challenge; we’ll gain you.
Are you ready?
• Location : Malta, MT