Data Analyst

Hungary, Macedonia, Romania, Turkey, Serbia, Croatia, Bosnia and Herzegovina (Hybrid)

We are looking for new engineering colleagues who are eager to push boundaries and achieve new milestones every day. We need a passionate professional who thrives on building robust data platforms, uncovering business insights, crafting compelling narratives, and bringing data-driven products to life. You will work with a wide range of concepts and architectural patterns. This special TEO team, the A team, works in a true consulting mode. The ideal candidate has experience across multiple industries and will contribute to the TEO team primarily by applying that cross-industry technology expertise. 

Your data analysis responsibilities will focus on interpreting data, automating processes, visualizing insights, collaborating with teams to optimize data pipelines, ensuring data quality, and providing actionable insights to support data-driven decision-making across the organization. 


Key Responsibilities:

  • Focus on data interpretation, process automation, and insight visualization 

  • Integrate data from external sources and third-party APIs for analysis and reporting 

  • Assist in designing ETL pipelines that extract data, transform it for analysis, and load it into the data warehouse or reporting tools 

  • Optimize data processing tasks for better reporting performance and faster insights 

  • Work with data engineers to understand the data pipeline architecture and help shape it to ensure smooth data flow for reporting and analysis 

  • Provide feedback and insights on structuring data lakes for easy access and analysis 

  • While the primary focus is analysis, work with teams to ensure that reports and dashboards handle large data volumes efficiently 

  • Analyze data performance metrics and make recommendations for system optimizations based on reporting needs 

  • Conduct in-depth analysis of data, identify trends, and help generate actionable insights for business teams 

  • Apply statistical methods and models to derive insights from data, assisting in forecasting, pattern detection, and decision-making 

  • Take ownership of features and code quality  

  • Keep up with the latest technologies available on the market and follow industry best practices   

  • Suggest optimizations and performance improvements for business processes, with an emphasis on the business value of data  

  • Design and implement solutions that depend on diverse data sources  

  • Stay familiar with industry standards such as HIPAA, HITRUST, PCI-DSS, GDPR, SOC2, and ISO-27001  

  • Implement data governance policies to ensure compliance and secure data management practices  

  • Design and implement dynamic data parsing, transformation and storage systems  

  • Understand and advocate the importance of high data accuracy throughout the system  

  • Proactively adapt to new tools that significantly improve efficiency, and apply AI practices in the analytics domain where possible 

 

Required Qualifications:

  • Experience with at least one of Scala, Java, or Python (preferably more than one) 

  • Proficiency in SQL 

  • Deep experience (preferably certified) in Databricks Analytics or Snowflake Analytics 

  • Experience with leading cloud providers, including AWS, GCP, or Azure 

  • Hands-on experience with Google Analytics, Google Search Console, Matomo, or similar tools 

  • Experience with Power BI, Qlik, Looker, Tableau, or similar tools 

  • Strong expertise in data modeling, data storytelling, dashboarding and business intelligence 

  • Experience with cloud computing and serverless paradigms  

  • Experience with building data processing pipelines and complex workflows  

  • Experience with Version Control Systems (Git, SVN)  

  • Data-specific algorithm design (search, clustering, data extraction, data transformation)  

  • Ability to both design new algorithms theoretically and implement and optimize them in Python  

  • Relational, object, graph, and document databases  

  • Unstructured to structured data transformation  

  • Web scraping and schema extraction  

  • English language proficiency

 

Nice to have: 

  • Experience with distributed environments  

  • Experience with CQRS and event sourcing approaches 

  • Experience with virtualization and containerized applications (Docker, Kubernetes)  

  • Experience with SAS Analytics 

  • A desire to build valuable data assets and help business decision-makers

  • Enthusiasm for AI, large language models (LLMs such as GPT, o1, Claude, Llama, and other architectures), and building agentic systems  

  • Ability to work on novel problems and devise creative, non-standard solutions  

  • Thrives on ideation, R&D, and product development

 
