Data engineering - prototype DENG

(new)

Designing, building, operationalising, securing and monitoring data pipelines and data stores.

Guidance notes

(new)

Activities include, but are not limited to:

  • identifying data sources, processing concepts and methods
  • evaluating, designing and implementing on-premises, cloud-based and hybrid data engineering solutions
  • structuring and storing data for uses including, but not limited to, analytics, machine learning, data mining, and sharing with applications and organisations
  • harvesting structured and unstructured data
  • integrating, consolidating and cleansing data (see the illustrative sketch after this list)
  • migrating and converting data
  • applying ethical principles in handling data
  • building in security, compliance, scalability, efficiency, reliability, fidelity, flexibility and portability
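
The sketch below is a minimal, hedged illustration of what harvesting, consolidating and cleansing can look like in practice: records are harvested from two hypothetical sources (a CRM CSV export and a JSON-lines event feed), consolidated and cleansed, then written to a single output. The file names, field names (customer_id, email) and cleansing rules are assumptions for illustration only, not part of the skill definition.

```python
import csv
import json

def harvest_csv(path):
    """Harvest structured records from a CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def harvest_jsonl(path):
    """Harvest semi-structured records from a JSON-lines feed."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

def consolidate_and_cleanse(records):
    """Normalise keys and values, drop incomplete records and duplicates."""
    seen, cleaned = set(), []
    for rec in records:
        row = {str(k).strip().lower(): str(v).strip() for k, v in rec.items()}
        key = (row.get("customer_id"), row.get("email"))
        if not all(key) or key in seen:
            continue  # skip incomplete or duplicate records
        seen.add(key)
        cleaned.append(row)
    return cleaned

if __name__ == "__main__":
    records = harvest_csv("crm_export.csv") + harvest_jsonl("web_events.jsonl")
    cleaned = consolidate_and_cleanse(records)
    with open("customers_clean.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["customer_id", "email"])
        writer.writeheader()
        writer.writerows({k: r.get(k, "") for k in ("customer_id", "email")} for r in cleaned)
    print(f"wrote {len(cleaned)} cleansed record(s)")
```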

Data engineering - prototype: Level 6

(new)

Leads the selection and development of data engineering methods, tools and techniques. Develops organisational policies, standards, and guidelines for the development and secure operation of data services and products. Ensures adherence to technical strategies and architectures. Plans and leads data engineering activities for strategic, large and complex programmes.

Data engineering - prototype: Level 5

(new)

Plans and drives data engineering solution development, ensuring that solutions balance functional and non-functional requirements. Monitors the application of data standards and architectures, including security and compliance. Contributes to organisational policies, standards, and guidelines for data engineering.

Data engineering - prototype: Level 4

(new)

Designs, implements and maintains complex data engineering solutions to acquire and prepare data. Creates and maintains data pipelines to connect data within and between data stores, applications and organisations. Carries out complex data quality checking and remediation.
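
As an illustration of the data quality checking and remediation described at this level, the sketch below applies some assumed rules (non-empty id, well-formed email, ISO-8601 dates), repairs what can be repaired safely and quarantines the rest. The rules, field names and remediation choices are illustrative assumptions; in practice they would come from the organisation's data standards.

```python
import re
from datetime import datetime

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def check_and_remediate(rows):
    """Split rows into (clean, quarantined), remediating what can safely be repaired."""
    clean, quarantined = [], []
    for row in rows:
        problems = []
        if not row.get("id"):
            problems.append("missing id")  # not safely repairable: quarantine
        # remediation: normalise email case and whitespace, then validate
        email = (row.get("email") or "").strip().lower()
        row["email"] = email
        if email and not EMAIL_RE.match(email):
            problems.append("invalid email")
        # remediation: coerce legacy dd/mm/yyyy dates to ISO-8601
        created = (row.get("created") or "").strip()
        try:
            row["created"] = datetime.strptime(created, "%d/%m/%Y").date().isoformat()
        except ValueError:
            if not created:
                problems.append("missing created date")
        if problems:
            quarantined.append((row, problems))
        else:
            clean.append(row)
    return clean, quarantined

if __name__ == "__main__":
    sample = [
        {"id": "1", "email": " Ada@Example.COM ", "created": "03/02/2024"},
        {"id": "", "email": "not-an-email", "created": ""},
    ]
    ok, bad = check_and_remediate(sample)
    print(f"{len(ok)} clean row(s), {len(bad)} quarantined: {bad}")
```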

Data engineering - prototype: Level 3

(new)

Designs and implements data pipelines and data stores to acquire and prepare data. Applies data engineering standards and tools to create and maintain data pipelines and to extract, transform and load data. Carries out routine data quality checks and remediation.
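
Below is a minimal extract, transform and load sketch of the routine pipeline work described at this level, assuming a hypothetical orders.csv source and a local SQLite store. The source, target, column names and the derived total are illustrative assumptions only.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from the source file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: type the fields and derive the order total."""
    for row in rows:
        yield (row["order_id"], row["customer_id"],
               int(row["quantity"]) * float(row["unit_price"]))

def load(rows, db_path="warehouse.db"):
    """Load: upsert the transformed rows into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, total REAL)"
        )
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
    print("loaded orders.csv into warehouse.db")
```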