Data Engineer - Brisbane

Company: Confidential
Location: Brisbane, Australia
JOB DESCRIPTION

Are you a Data Engineer with strong Azure or AWS experience? If the answer is yes, this large Financial Services organisation is looking for you.

You will help build out a new data platform, the foundations of which have already been built and now need fleshing out. The platform is in the Microsoft stack, including Azure, with a whole host of other technology utilised to help advance it as the company moves its data into a future state.

Your responsibilities will include:

Manage data pipelines
Drive automation through effective metadata management
Learn and apply modern data preparation, integration and AI-enabled metadata management tools and techniques, including:
Tracking data consumption patterns
Performing intelligent sampling and caching
Monitoring schema changes
Recommending, or sometimes even automating, existing and future integration flows
Collaborate across departments, working with data science teams and business data analysts to refine their data consumption requirements for various data and analytics initiatives
Educate and train
Participate in ensuring compliance and governance during data use
Be a data and analytics evangelist

You should have a good mix of the following skills and experience (we appreciate this list is lengthy):

Experience with advanced analytics tools for object-oriented/object function scripting using languages such as R, Python, Java
Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management
Experience with popular database programming languages including SQL, PL/SQL and others for relational databases, and certifications in upcoming NoSQL/Hadoop-oriented databases like MongoDB and others for non-relational databases
Experience in working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies
Experience in working with and optimizing existing ETL processes, data integration and data preparation flows, and helping to move them into production
Experience in working with both open-source and commercial message queuing technologies such as Kafka, JMS, Azure Service Bus and others, and stream data integration and analytics technologies such as Databricks and others
Experience working with popular data discovery and analytics software tools like Tableau, Qlik and PowerBI
Experience in working with data science teams in refining and optimizing data science and machine learning models and algorithms
Demonstrated success in working with large, heterogeneous datasets to extract business value using popular data preparation tools such as Trifacta, Paxata, Unifi
Basic experience in working with data governance, data quality and data security teams
Demonstrated ability to work across multiple deployment environments including cloud, on-premises and hybrid, and multiple operating systems
Adept in agile methodologies and capable of applying DevOps and, increasingly, DataOps principles to data pipelines



JOB TYPE
Work Day: Full Time
Employment type: Permanent Job
Salary: Negotiable


JOB REQUIREMENTS
Minimum experience: No experience


