Data Analyst & BI Reporting Developer (ETL)

Permanent
Technology
West Midlands
Work from home
Negotiable, dependent upon experience
ETL_0721

Data Analyst & BI Reporting Developer

Extract, Transform & Load

UK wide – work from home – remote working


@mecscomms is recruiting for a remote-based, home-working Data Analyst & Reporting Insight Developer to support Business Intelligence (BI) initiatives for a global IT, ICT, Technology, Cloud & Communication managed service provider. The Data Analyst & Reports Developer will manage the extraction, analysis & cleansing of high volumes of complex data from multiple legacy systems in preparation for a major systems migration. You’ll also support automated intelligence reporting, data analysis, BI reporting & future integrations. I’m keen to hear from any SQL data analysts with extensive ETL experience.

 

Position:          Business Intelligence, Big Data, Data Analyst, Data Manipulation, Insight Developer, Reports Developer, Business Analyst, Analyst Programmer, Application Developer, Automation, Data Integrity, Data Modelling, Reporting

 

Purpose:          Manage the extraction, analysis & cleansing of high volumes of complex data from multiple legacy systems in preparation for a major systems migration. Support automated intelligence reporting, data analysis, BI reporting & future integrations. Program, develop, test & implement automated reports.

 

Location:          Work from home – remote working – anywhere UK

 

Duration:          Full time, permanent employment

 

Salary:             circa £50,000 plus benefits

 

Environment: IT, ICT, Technology, Cloud, Hosting, Managed IT Services, Virtualisation, Systems / Network Integrator, Telecom, ISP, Microsoft Technology Partner, Consultancy, Unified Communications, Collaboration, Business Intelligence, Big Data, Data Analyst, Data Manipulation, Insight Developer, Reports Developer, Business Analyst, Analyst Programmer, Application Developer, Automation, Data Integrity, Data Modelling, Data Migration, Reporting, ETL Scripting, Shell Scripting, BI Reports, Databases & SQL queries, Real time data ingestion, Kafka, Java, ASP.net, Power BI, Microsoft Dynamics, Machine learning & ServiceNow

 

Key Activity:

 

  • Database manipulation
  • SQL Queries
  • Complex ETL scripting  
  • Development of reporting tools & dashboards
  • Data analysis & interpretation
  • Data cleansing & data modelling
  • Business analysis & reporting
  • Report automation development 
  • Insight process management
  • Governance & Control

 

Overview:

 

The role will support the extraction of data from legacy systems, analysing & cleansing it in preparation for loading into new target platforms. You’ll also focus on business-as-usual data analysis, reporting & cleansing, and will identify areas of performance improvement through the development & deployment of a robust suite of automated reporting tools, including interactive & customised reports & dashboards.

 

You’ll play a pivotal role in providing Business Intelligence through developing reporting tools & the extraction, transformation & loading of data into business management systems.  
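
As a rough illustration of the data quality analysis involved, the following is a minimal SQL sketch, assuming a hypothetical legacy_customers staging table (the table & column names are placeholders, not an actual client schema):

    -- Illustrative only: profile a hypothetical legacy staging table for
    -- quality issues (duplicates, missing values, suspect dates) before
    -- any transformation or migration work.
    SELECT
        COUNT(*)                                        AS total_rows,
        COUNT(*) - COUNT(DISTINCT customer_ref)         AS duplicate_refs,
        SUM(CASE WHEN email IS NULL OR TRIM(email) = ''
                 THEN 1 ELSE 0 END)                     AS missing_emails,
        SUM(CASE WHEN created_date > CURRENT_DATE
                 THEN 1 ELSE 0 END)                     AS future_dates
    FROM legacy_customers;

In practice the real profiling & cleansing rules would be driven by the technical specifications & source-to-target mappings described below.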

 

Responsibilities:

 

  • Conduct business analysis to identify needs & reporting requirements

 

  • Produce technical specification documents for reporting, ETL activity & insight tools

 

  • Consume & manage large volumes of complex data from multiple systems
  • Analyse data for quality, consistency & accuracy

 

  • Prepare data transformation sets ready for input/upload to new target applications

 

  • ETL & data cleansing

 

  • Support the extraction of data from legacy systems, analysing & cleansing it

 

  • Collaborate with transformation, IT & business teams to implement data migration solutions

 

  • Get under the skin of data & processes to understand their value

 

  • Document, describe, review, & refine source-to-target mappings with migration teams & stakeholders

 

  • Implement ETL scripts to prepare legacy data for ingestion into the new target environment (a sketch follows this list).

 

  • Write ETL scripts & code to ensure high performance data transformation

 

  • Re-engineer manual data flows to enable scaling & repeatable use

 

  • Develop repeatable BI reports

 

  • Act as part of an agile team migrating products & services from source systems into the new target architecture.
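
By way of illustration, the kind of ETL load step referred to above might look like the following minimal SQL sketch, assuming the same hypothetical legacy_customers source and a new crm_contacts target table (all object names & mappings are placeholders):

    -- Illustrative only: cleanse & map legacy rows into a new target table
    -- as part of a source-to-target migration (object names are assumptions).
    INSERT INTO crm_contacts (contact_ref, full_name, email, created_at)
    SELECT DISTINCT
        TRIM(customer_ref),
        TRIM(first_name) || ' ' || TRIM(last_name),
        LOWER(NULLIF(TRIM(email), '')),   -- blank emails become NULL
        created_date
    FROM legacy_customers
    WHERE customer_ref IS NOT NULL;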

 

Candidate profile:

 

Candidates must have previous, comparable experience of designing, developing, delivering & maintaining large-scale data infrastructure environments. You’re likely to have worked within a large enterprise, Software Vendor, Managed Hosting, Cloud, Systems / Network Integrator or similar type of managed technology service provider. Your experience is likely to include as many of the following as possible:

 

  • Large database environments

 

  • Extensive SQL queries

 

  • Extensive ETL experience

 

  • Strong shell scripting

 

  • Real time data ingestion

 

  • Apache Kafka or Confluent Cloud

 

  • Power BI

 

  • Microsoft Dynamics

 

  • Excellent data modelling

 

  • Development language knowledge (Java/ASP.net/other)

 

  • Data science

 

  • Familiarity with machine learning & ServiceNow

 

  • Software engineering best practices

 

  • Agile Software Development methodologies

 

  • Process-oriented with great documentation skills

 
